The Triton malware attack was far from the first time that hackers have attempted to target the networks of an industrial facility, but it was the first time that malware designed to attack safety systems was ever seen in the wild.
The malware was designed to manipulate Schneider Electric's Triconex Safety Instrumented System (SIS) controllers – emergency shutdown systems – and was uncovered on the network at a critical infrastructure operator in the Middle East.
The malware campaign was extremely stealthy and was only uncovered because the attackers made a mistake and triggered the safety system, shutting down the plant. The outcome could've been much worse.
"We can speculate that their mission is of some physical consequence. They wanted to either stop production at this facility, stop things from working or potentially cause physical harm," says Dan Caban, incident response manager at FireEye's Mandiant.
Speaking during a session on Triton at the National Cyber Security Centre's CYBERUK 19 conference, Caban argued that it was fortunate the malware was uncovered, alerting the world to dangerous cyberattacks that can alter or damage physical systems.
"We were very lucky that this accident happened, it opened the door for people to start thinking about this physical consequence which may have cybersecurity origins – that's how this investigation kicked off and now so much has come to public light," he says.
Following the initial compromise, the attackers harvested credentials and moved laterally across the network to reach the SIS controllers.
However, Triton was only able to reach its goal because of some lax attitudes to security throughout the facility: the safety controllers should have been disconnected from the network but were connected to internet-facing operational systems, allowing attackers to gain access.
Other failures – like a key being left inside a machine – provided attackers with access they should never have gained without physically being inside the facility.
While the malware has the potential to be highly damaging to valves, switches and sensors in an industrial environment, the threat can be countered by implementing some relatively simple cybersecurity techniques that make movement between systems almost impossible.
"Network segregation can help you avoid this happening. You should be separating them logically, but also based on criticality and by following industry best practice and industry standards," Caban explains. "You should also consider directional gateways so it's not possible to move certain ways."
Organisations can also take a step in this direction by ensuring there's proper governance around cybersecurity, and enough documentation of their systems that staff at all levels understand what's going on – and what to do if something goes wrong.
"In a cyber context, it's absolutely essential that you have governance; leadership from the very top level. Without proper governance in your organisation, you're probably setting up for failure," says Victor Lough, head of UK business at Schneider Electric.
"For cybersecurity, you must consider the physical safety because you're considering kinetic systems. And on the flip-side of that, physical safety must always consider cybersecurity, so they're opposite sides of the same coin – without security we have no safety," he says.
There was once a time when the security of cyber systems and the security of physical systems could be considered separately, but not any more: in many cases, they're now one and the same.
"This is the blending of the cyber and the physical security – the things you can put bollards around. You kind of could have in this case - they left the key in and left it in programme mode," said Deborah Petterson, deputy director for critical national infrastructure at the UK's NCSC.
In this incident, realising that the key had been left in the machine would have gone a long way to preventing hackers from gaining access to conduct malicious activity.
"People knowing where their safety systems are and how they're connected – it's really basic," she said, suggesting that those running these systems should regularly be examining how the networks operate and should keep logs about updates – especially about dated systems like the industrial facility was running on.
"The one in this example was 15 years old – when was the last time you looked at risk management around that? The churn in security people is one to two years with CISOs. When was the last time you dusted off and used this as a point to go and have a look?" Petterson asked.
Triton targeted critical infrastructure in the Middle East, but there are lessons from the incident that can be applied to organisations in every sector, no matter where they are in the world.
"If you take this out of the context of safety systems, you can apply almost all of them to any enterprise system. They're the same sort of controls we just ask any business to do to make themselves cyber safe," says Dr Ian Levy, technical director at the NCSC.
The hacking group behind Triton – which has been linked to Russia – remains active, with researchers at FireEye recently disclosing a new campaign targeting another critical infrastructure facility.
However, with the tactics of the group now in the public eye, it's possible to detect and protect against malicious activity.
"All these backdoors, lateral movement techniques and credential harvesting: they can be detected, it's possible, we don't have to give up hope," said FireEye's Caban.
"They can be detected in IT, detected between the IT and OT DMZ – those are easy places to start looking."