Posted by Martijn Grooten on Jul 7, 2017
A week and a half after the outbreak of (Not)Petya, we are still not 100% certain about the motivation of the attackers. Was it a (failed) attempt to extort money from a large number of organizations? Did a nation-state-affiliated group intend to take an armed conflict into the cyber realm by causing as much damage as possible? Or is all the confusion and uncertainty about what really happened, and for what reason, the actual goal of the attack?
This latter option is a real possibility and makes it even more important for security experts to respond calmly and not to further spread fear, uncertainty and doubt.
If it were indeed the case that (Not)Petya intended to cause confusion and uncertainty, it wouldn't be unique in its goals. Much has been written about the alleged 'hacking' of last year's US presidential election. The consensus among experts is that hack attempts did indeed take place, but that these attempts were unsuccessful in that they did not actually change any counted votes.
This may sound like a good thing, and it is of course better than the alternative, but what if a (secondary) goal of the hacking was to undermine trust in the elections? We tend to focus on the technical aspects of election hacking, which are both interesting and important, but the mere idea that such hacking may take place is just as important an aspect of it, if not more so.
We should thus make it clear in our analyses of election technologies and their hacks whether there is any evidence that weaknesses and hacks actually led to vote tampering, and point out that, no matter how vulnerable some systems are, changing results is often still extremely difficult. We may also want to point out that the mere perception that electronic voting can be hacked would be a good enough reason to avoid this kind of voting altogether, no matter how good we might become at finding technical solutions.
A different kind of threat, where the goal of spreading uncertainty is even clearer, is that of repressive governments listening in on the communications of citizens who disagree with their policies, both through mass surveillance and through more targeted hacks. There are, unfortunately, many known cases of this, with that of Ahmed Mansoor possibly the best known: the UAE-based lawyer was targeted by his government via iPhone malware that allegedly cost US$1 million to purchase. (Ahmed did not fall for the spear-phishing attempts that would have led to the malware infection, but he is currently behind bars for a tweet he sent.)
The goal of this kind of espionage and surveillance is rarely just to find out what is being discussed; it is also, and perhaps more importantly, to instil the feeling that anything that is said is being listened to, thus forcing people into self-censorship.
It is important to report on these kinds of threats, which are very real, but we should do so in a calm and fact-based manner: we should not make the threat seem greater than it is (a recent BBC radio documentary on the subject implied that governments could break any encryption, which is certainly not true), and we should make it clear that there are many tools and behaviours that do work to keep people safe. If we fail to do this and simply report these as yet more scary threats, we may actually be helping these governments.
As for (Not)Petya, this was a real threat, and a big one at that, which did some serious damage. At the same time, we should avoid mistaking a number of exceptionally badly hit companies for a global meltdown, which this clearly wasn't. Our increasing reliance on IT infrastructure has simply made companies and organizations more vulnerable to failures in that infrastructure, whether they are caused by an attack or by a power cut.
Even in Ukraine, only one in ten computers within government organizations and businesses was affected. While that is a high rate, it hasn't crippled the country. Indeed, with MEDoc's servers taken offline as part of the investigations, Ukrainian companies are now sharing older, non-backdoored versions of the software via Google Drive, so that people can continue to do their jobs.
(Not)Petya has taught us many lessons, the most important of which is probably that software written by a small family business can, if widely used, become part of critical infrastructure. It is probably wise to focus on those lessons and, implicitly, thank the attackers for a valuable if costly exercise in incident response, rather than to dig up as many horror stories as we can of businesses that are still battling the malware.
Cyber threats can be very confusing, and the fear, uncertainty and doubt they may cause are very understandable. Security experts should not feed into these feelings but rather respond calmly and reassuringly. This applies to all kinds of attacks, but even more so in the very real cases where this FUD is exactly what the attackers are aiming for.