The uproar surrounding Facebook, which sent its share price plummeting by $60 billion three years ago and threw the management around founder Mark Zuckerberg into a panic, is interesting in several respects. Not least because much of what boiled over at the time had essentially been known for years. The story is complex.
The plot involved quasi-secret services – SCL, the parent company of the British firm Cambridge Analytica at the center of the scandal, honed its tools of psychological warfare in Pakistan and Afghanistan – as well as colorful whistleblowers (Christopher Wylie), ultra-right-wing string-pullers (Steve Bannon), supposed scientists (Aleksandr Kogan) and, one could almost say as always, alleged Kremlin connections.
He has a soft spot for risk and for writing about cyber: in his day job as a security researcher at HiSolutions AG, David Fuhr rants and rages in this column about current incidents and general truths of information security. Alongside new pieces, articles already printed in iX appear here, always with a tongue-in-cheek update on the current security situation.
ISO Guidelines for Cybersecurity
Let us set aside all political speculation for the moment and try to analyze soberly, with the tools of security, what happened back then. ISO 27032 can be useful here. The otherwise rather superfluous standard, titled “Guidelines for Cybersecurity”, contains a diagram that very vividly lays out the most relevant terms, concepts and patterns of thought in IT security.
Against this methodological background, the Facebook fiasco of the time can be explained as follows. The stakeholders of any commercial enterprise – and Facebook is undoubtedly one – include at least the owners and the customers. So Zuckerberg, his VCs (venture capitalists) and all of us? Not quite. Since Facebook is essentially free to use, the users (called “members of the community” at Facebook) are not customers!
The business adage “If you don’t pay for the product, you are the product” applies here in particular (Facebook analyzes all of its users, as an article on Gizmodo shows). Besides Zuck & friends, the key stakeholders are therefore primarily the real customers, i.e. the advertisers, as well as “third parties” such as the users (as a data source and as steerable flocks of consumers) and possibly other bodies such as regulators, see below.
Which values (assets) need to be protected? For the owners it is clearly the share price; for the advertisers, their ROI (return on investment). For the users, several values are at stake: above all, participation in social (online) life and informational self-determination.
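This stakeholder/asset analysis can be made concrete. The following is a minimal sketch (invented for illustration, not taken from ISO 27032 itself) that models the parties and values named above as a simple data structure:

```python
# Illustrative sketch of the stakeholder/asset mapping discussed above.
# The structure and values are this column's analysis, not part of ISO 27032.
from dataclasses import dataclass


@dataclass
class Stakeholder:
    name: str
    assets: list[str]  # the values this party needs protected


stakeholders = [
    Stakeholder("owners", ["stock market price"]),
    Stakeholder("advertisers", ["return on investment (ROI)"]),
    Stakeholder("users", [
        "participation in social (online) life",
        "informational self-determination",
    ]),
]

for s in stakeholders:
    print(f"{s.name}: {', '.join(s.assets)}")
```

Laying the analysis out this way makes one point of the column visible at a glance: the users appear as just one stakeholder among several, and their assets are of a different kind than the purely financial ones.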
In addition to the usual potential attackers (cyber criminals of every stripe), other shadowy players have long since come into view who want to exert political influence and are not squeamish in their choice of means. Besides SCL, hundreds of other more or less shady companies had been able to pull large amounts of data out of Facebook.
In addition to the abstract threat of the “data protection problem”, which arises wherever personal information is processed, a new one has emerged: massive, targeted, hard-to-notice manipulation of opinion. With Facebook and co., the question arises whether this is actually a bug or a feature, since on the economic level, in the case of advertising, the whole thing is supposed to work exactly that way.
Like all the big tech players, Facebook has been investing heavily in security measures for years; reportedly up to 20,000 employees work on security alone (in all its facets). Many classic problems thus seemed well under control, modulo the usual cat-and-mouse game with attackers: until a few years ago, Facebook had hardly made headlines with major breaches. The new problem of deep personality profiling and election influencing, however, is of an entirely different kind and is still far from solved.
Mating season for data octopuses
Now the next escalation of the “uncontrollable data octopus” issue looms: contrary to earlier assurances, Facebook is in the process of merging the data troves of the social network and its subsidiary WhatsApp in order to boost its analytics options (and its stock market value) yet again. For the time being this has been stopped, at least for the almost 60 million users in Germany, by the German data protection authorities, with an explicit reference to the risk of political manipulation. But the decision now lies at the EU level.
We have to decide how serious we are about defending democratic processes and values, even if it costs us “cool” and “useful” tools and gadgets.
As is well known, risk is calculated as the product of the possible worst-case damage and the probability of occurrence. One can confidently assume that systematic attempts to manipulate political decision-making will be the rule rather than the exception in the future. The amount of damage, however, is hard to quantify: was SCL the deciding factor for Trump, or AggregateIQ for Brexit? What is certain is that skillful manipulation attempts alone are enough to undermine trust in our democratic processes. And that Facebook, with its targeting, makes it cheap for attackers to bombard exactly the relevant individuals with optimally perfidious disinformation.
A few additional security measures will not solve this, because the actual weaknesses in the scandal lie on a different level: the undeserved trust Facebook’s managers placed in their partners, naivety toward “scientific” methods and actors, and a lack of sensitivity, on all our parts, to the dangers of data analytics.
A little more self-regulation will not suffice to close Pandora’s box again. The media rightly count among the critical infrastructures and are worthy of protection, because attacks on them can dangerously disrupt social coexistence. Facebook and co. should be measured against this standard and treated accordingly.