The Observer view on how Facebook’s destructive ethos imperils democracy

Facebook CEO Mark Zuckerberg poses for a selfie with a group of entrepreneurs. Photograph: Jeff Roberson/AP

Facebook likes to present itself as a tech company, but often appears more like an advertising corporation that happens to use digital technology to conduct its core business. The personal information and data trails left by its 2 billion users are used to construct detailed profiles, which allow advertisers to send precisely calibrated advertisements to people who are likely to be susceptible to, or persuaded by, them.

Although the original intention was to build an automated machine for delivering commercial messages, it rapidly became clear that the technology could also be used to deliver targeted political messages to voters, and this appears to be what happened in both the Brexit referendum and the 2016 US presidential election. In the process, Facebook acquired both political power and serious responsibilities.

The revelations in our lead story today are shocking not just because they reveal the extent to which Facebook’s advertising system was exploited for political purposes in the 2016 election, but also because they demonstrate the company’s inability to comprehend the responsibilities that accompany its newfound power.

The revelations show that a data analytics firm was able to harvest the profiles of about a third of all US Facebook users, which were then used to construct psychological models of those individuals for campaign purposes.

This was no run-of-the-mill cybercrime heist that merely stole credit card details. The information that Facebook holds on its users (at least 98 data points per user) is deeply revealing of their tastes, preferences, habits, sexuality, politics, hopes and fears. Academic research has shown that even knowledge of a few “Likes” can reveal an astonishing amount about an individual Facebook user. For political campaigners, this is the purest gold dust, because it enables messages to be precisely calibrated and delivered at a scale that was unimaginable in the pre-internet era.

In a breathtaking piece of corporate casuistry, Facebook claims that this data harvest was not really a data breach at all, because the researcher who opened the floodgates did so “in a legitimate way and through the proper channels”.

The problem, the company says, was that the individual in question broke its rules by passing the information on to third parties. A senior Facebook executive told MPs that while the non-breach might have garnered lots of data, “it is not data that we have provided”.

Our revelations also show that by late 2015 Facebook had found out that information had been harvested on an unprecedented scale but failed to take firm measures to deal with the consequences or to notify the affected users of what had happened. This seemingly cavalier indolence provides an ironic counterpoint to the company’s latest insistence that “protecting people’s information is at the heart of everything we do”.

In a way, this kind of casual indifference to the unintended consequences of digital technology is par for the Silicon Valley course – where the mantra of “creative destruction” has the status of religious dogma. And it appears to have been a particular hallmark of Facebook. When suspicions about the exploitation of its systems by political actors (including Russian agencies) first surfaced, the reaction of its founder and CEO, Mark Zuckerberg, was one of hurt denial that his creation could have such malign effects.

Since then, further allegations have been levelled, and he has been obliged to follow in the footsteps of the hero of Mary Shelley’s novel Frankenstein – gradually forced to come to terms with the implications of the monster that he and his employees have created.

The revelations we publish today as part of an award-winning investigation by Carole Cadwalladr should serve as a wake-up call for governments and regulators. Facebook represents a new kind of corporate power, the dimensions of which are only now becoming apparent. The automated machine it built, with the capacity to target individuals with commercial messages, turns out to be exceedingly useful for targeting voters with political messages calibrated to produce political effects – to raise anxiety, reinforce prejudices, suppress turnout, amplify partisanship and increase the reach of misinformation and conspiracy theories.

And at the moment, all this can be done under the radar of the institutions that democracies have created to ensure free and fair elections, control campaign funding and maintain transparency about political advertising.

Shortly after Facebook became a public company, its founder famously exhorted his employees to “move fast and break things”. It was, of course, a hacker’s trope and, as such, touchingly innocent. What perhaps never occurred to Zuckerberg is that liberal democracy might be one of the things they break. It’s time for him – and them – to grow up.