Some time ago, we looked at turning black into white and at how decompilation can be used in industrial systems. Our analysis covered two practical aspects: legacy system algorithm reconstruction and backdoor detection.
We all remember how tiny the decompilation market was until recently, and how vague its development prospects seemed. In recent years, however, the situation has changed. So today we would like to analyze the new developments, describe the current state of the market, and offer a forecast for the coming years.
From industrial control system reincarnation to business system reliability
The number of requests to reconstruct the algorithms or code of legacy SCADA systems has more than halved over the last three years, for reasons both economic and technological. Most of these systems are hopelessly outdated, no longer meet modern business and IT requirements, and are therefore being replaced with up-to-date products. As part of that migration, companies need to reconstruct the algorithms of their homegrown legacy industrial control systems (ICS), which are highly idiosyncratic and tailored to specific users, in order to fine-tune the new systems.
The number of backdoor detection requests, by contrast, has grown almost three-fold. This explosive growth coincided with geopolitical instability and increasingly frequent detection of unknown traffic from various information systems. Notably, backdoor search requests are not limited to industrial control systems; many concern customized business systems. One of the most prominent cases was a request to detect the hidden transmission of customer data to a third-party server, which was then quickly blocked by reconfiguring the firewall.
Today: further development drivers
Although the market for decompilation-related services designed to check industrial software for vulnerabilities and backdoors is relatively small, it continues to grow rapidly. We see the following key drivers at play:
High labor-intensity and cost of services
The approach to decompilation tasks is evolving much like the transition from artwork creation (where the result depends heavily on the artist's talent) to flow production (where the key role belongs to the maturity of the technology itself and the production processes built around it). For now, though, decompilation remains a kind of "art", with project success depending entirely on expert competence and talent.
The work of experts is seriously complicated by code obfuscation: transforming code into a form that leaves its functionality intact but hampers analysis, understanding, and modification of the algorithm during decompilation. Because obfuscation varies in complexity, experts may spend anywhere from a few minutes to endless hours trudging through the code, and the more complicated the obfuscation, the greater the contribution of expert skill to project success. Obfuscation is thus one of the key cost factors: it can multiply the cost and duration of decompilation, or even render a project economically unfeasible. However, further technological advances will likely automate the bulk of the routine work, while more effective and less resource-intensive methods are developed for the difficult tasks that cannot be automated. We can therefore expect a dramatic reduction (by multiple times in some cases) in the labor-intensity and cost of such services.
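To give a feel for what an analyst faces, here is a minimal, entirely hypothetical sketch (not taken from any real project) of the same routine before and after a trivial obfuscating transformation. Both functions compute an identical result, but the second hides the simple accumulation behind opaque names and a two's-complement detour:

```python
def checksum(data: bytes) -> int:
    """Clear version: sum of all bytes modulo 256."""
    return sum(data) % 256


def f(a):
    # Obfuscated version, behaviorally identical to checksum().
    # (d ^ 0xFF) + 1 is the two's complement of d modulo 256, so the
    # loop accumulates minus the byte sum; the final negation undoes it.
    b = 0
    for d in a:
        b = (b + (d ^ 0xFF) + 1) & 0xFF
    return (-b) & 0xFF
```

Even at this toy scale, recovering the intent of `f` takes noticeably longer than reading `checksum`; real-world obfuscators apply hundreds of such transformations at once, which is precisely where expert time is consumed.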
Today, skilled and experienced decompilation specialists are usually employed by antivirus labs, restricted-access institutes, and government agencies; you can hardly find any on the open market offering a straightforward project service of the kind conventional in the B2B sector. As a result, engaging such specialists on a project or service basis is very rare. This situation is gradually changing, however. On the one hand, the number of experts with at least some interest in decompilation technology will grow over time; on the other, those now working only at closed labs and institutes will be increasingly in demand at commercial companies, and will thus increasingly offer their expertise on the market as project services.
Decompilation awareness remains limited
Frankly speaking, very few companies know about the services available in this area. In our experience, only a limited number of our customers have a general idea of decompilation technology, and even fewer understand how to use it to solve business tasks. This lack of awareness strongly constrains the technology's development: the higher the awareness, the greater the demand, which encourages increased supply and, in turn, drives competition and technological advancement.
What is beyond the horizon?
In the short term, we expect decompilation technology to become increasingly popular and automated. Human experts will still play the leading role, and their talent and competence will remain the guarantee of efficient decompilation. However, we forecast that within the next three years the market will see enterprise-grade solutions offering decompilation functionality alongside code analysis features. These solutions will be a godsend for those without in-depth knowledge of software development, and of decompilation in particular. They will reduce the labor-intensity and cost of such work, allow more specialists to handle simple and medium-complexity issues, and boost practice-oriented demand for the technology.
As for SCADA, we believe the need to reconstruct legacy SCADA system algorithms (or entire codebases) will fade away within several years: recovering extremely obsolete SCADA systems is becoming less and less reasonable compared with deploying up-to-date ones. At the same time, demand for backdoor search tools and services, driven by growing awareness among security specialists, will grow steadily, especially for SCADA systems installed at mission-critical facilities.