It’s not yet common to see malicious attacks designed to destroy industrial equipment, but the proof-of-concept experiments on the vulnerability of industrial control systems only grow more compelling with each new example of equipment-destroying malware. The best-known example is Stuxnet, but hackers at the recent Black Hat security conference showed even more evidence that such attacks can pinpoint weaknesses in nearly any physical system.
At the conference last week, Honeywell security researcher Marina Krotofil demonstrated an attack on an industrial pump using only a laptop and something we don’t generally consider insidious: bubbles. Without any physical access to the Flowserve pump itself, she injected a malicious command into the system that sent thick bubbles coursing through its tubes. As the bubbles flowed through the pump, its vibration sensors picked up a subtle tremor while the bubbles wore down the equipment, pitting its metal surfaces and damaging its impellers. Within hours or days, the $50,000 piece of equipment would be severely damaged.
All of this was accomplished merely by adjusting a valve further upstream to decrease the pressure in the chamber, allowing bubbles to form. The bubbles then cause “cavitation”: they implode against the pump’s surfaces, transferring their energy to the equipment. According to Krotofil, “They collapse at very high velocity and high frequency, which creates massive shockwaves.”
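The physics behind the attack can be summarized in a single inequality: a liquid flashes into vapor bubbles wherever local pressure falls below the fluid’s vapor pressure, which is what throttling the upstream valve achieves. A minimal sketch of that condition (the function name, units, and example values are illustrative, not from Krotofil’s demo):

```python
def cavitation_risk(local_pressure_kpa: float, vapor_pressure_kpa: float) -> bool:
    """Cavitation bubbles can form when the local pressure in the fluid
    drops below its vapor pressure at the operating temperature."""
    return local_pressure_kpa < vapor_pressure_kpa

# Water at 20 °C has a vapor pressure of roughly 2.3 kPa.
print(cavitation_risk(101.3, 2.3))  # near-atmospheric pressure: False
print(cavitation_risk(1.5, 2.3))    # heavily throttled inlet: True
```

The point is that the attacker never touches the pump: lowering one upstream pressure value is enough to push the fluid across this threshold, and the imploding bubbles do the rest of the damage.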
This demonstration was more than an exploration of the weaknesses of a certain system. It was meant as a warning of upcoming attacks known as “cyberphysical” hacking, where attackers can damage or send machines offline using physics and chain reactions, even if they only have access to a few components of the system.
The real risk here is that, because so many machine failures occur by accident, it’s easy to mask these threat vectors as normal system malfunctions.
How to Defend Against Cyberphysical Attacks?
In addition to taking the usual precautions, like firewalls and IT intrusion-detection systems, Krotofil recommends tools such as anomaly detection and broader machine analytics that flag data points trending toward danger zones, even if they aren’t currently triggering alarms.
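The idea of flagging readings that drift toward a danger zone before any hard alarm fires can be sketched as a simple rolling z-score check on a sensor stream. This is a minimal illustration of the general technique, not Krotofil’s or Honeywell’s actual analytics; the window size and threshold are arbitrary example values:

```python
from statistics import mean, stdev

def trending_anomaly(readings, window=20, z_threshold=3.0):
    """Flag the latest sensor reading if it deviates sharply from the
    recent baseline, even if it is still below any hard alarm limit."""
    if len(readings) <= window:
        return False  # not enough history to establish a baseline
    baseline = readings[-window - 1:-1]      # the `window` readings before the latest
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return False  # perfectly flat baseline; any deviation needs human review
    return abs(readings[-1] - mu) / sigma > z_threshold

# A steady vibration signal, then a sudden departure from the baseline:
stable = [1.0, 1.02] * 10
print(trending_anomaly(stable + [2.0]))   # True  -- well outside recent behavior
print(trending_anomaly(stable + [1.01]))  # False -- within normal variation
```

A real deployment would track trends across many correlated sensors rather than one series, but the principle is the same: the cavitation attack produced only subtle vibration changes, and it is exactly this kind of statistical drift, rather than a threshold alarm, that reveals it.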
While these attacks are still uncommon, they do happen. In 2014, a German steel mill suffered massive damage when attackers prevented a furnace from shutting down, and last year the Industroyer malware triggered blackouts in Ukraine by attacking the utility Ukrenergo.
Such sabotage can have many motivations: nation-states attack covertly, disgruntled employees know how to access systems and which components are vulnerable, and saboteurs may simply threaten to break down machines unless ransoms are paid.
For this reason, it’s critical to protect the physical components of industrial control systems as well as the software. The two must be able to communicate with each other to warn operators of danger, and IT protections should ensure that, even if hackers gain access to parts of the equipment, there is adequate warning when machines are becoming less efficient, trending away from ideal performance, or malfunctioning in uncommon ways.