Facebook audit panel suggests creating an elected 'parliament' of users to rewrite its rules

Mark Zuckerberg has visited some legislatures, and snubbed many more; should he have his own? - Anadolu

An independent panel set up by Facebook to scrutinise its own behaviour has said it should consider creating an elected "parliament" of users with the power to rewrite its rules.

In a report released on Thursday, the group of academics said Facebook was suffering from a "crisis of public trust" created by its "top-down" approach to moderating and censoring what its users post on its service.

It criticised the social network's explanations of its moderation decisions, calling them "vague and cursory", and asked it to consider recruiting randomly selected "juries" of users who would decide on appeals.

The suggestions go far beyond Facebook's existing plan to build an independent "supreme court" to hear user appeals, and would represent an unprecedented new step along Facebook's path to acting like a government.

"While Facebook has made laudable efforts to advertise the ways in which it enforces community standards, these efforts can be described as part of a top-down project to build a global, private government that is seeking to regulate content among more than 2bn individuals," the panel said.

"Users feel disempowered in their community. They become increasingly 'litigious' concerning the ways in which the standards apply to them. Users also also express worries concerning long-term policies that apply without taking their interests into account."

"They routinely challenge the competence and powers of the 'officials' who apply the community standards. In sum, the network suffers from a crisis of public trust."

The Data Transparency Advisory Group (DTAG) was set up by Facebook last year to scrutinise the way it collects and publishes data on how much harmful content exists on its platform and how the company ends up removing it. Its members include professors from University College London, King's College London and Yale Law School.

Some of their report was positive, judging Facebook's moderation practices were "sensible" and that its ways of measuring harmful content were "reasonable". But it also said it had not been allowed to audit the company's operations in detail, and said its regular "transparency reports" were not granular enough.

Following a recommendation from the panel, Facebook published its first ever statistics on appeals and reversed decisions, showing that up to 6pc of the content its moderators take down is eventually restored.

The most contentious category was hate speech, with 28pc of affected users appealing Facebook's action. Only 12pc of those appeals succeeded, however, compared to 22pc of appeals relating to adult nudity and sex.

The company also stepped up its purges of illegal drugs, taking action on around 900,000 pieces of content in the last six months. It took down 2.2bn fake accounts in the first three months of 2019, a mammoth number considering that Facebook has 2.3bn actual users.

Chief executive Mark Zuckerberg said this was driven by "automated attacks by bad actors" making massive numbers of accounts, most of which were blocked at the moment of creation and which were never counted in Facebook's user numbers.

While broadly approving of Facebook's moderation statistics, the DTAG was far more critical about its legitimacy in the eyes of its users. It compared the relationship between them and Facebook's moderators to the relationship between citizens and police, claiming that its "officials" had become disconnected from the people they serve.

It said that Facebook's explanations to users when their posts are removed failed to specify which rule was broken, let alone why the post was judged to break that rule, and argued that its "appeal" button "does not invite the user to participate or voice their point of view in any way".

Currently, Facebook's rules are written by employees in consultation with outside experts, and debated and then passed or rejected in a fortnightly meeting known as the Content Standards Forum.

Mr Zuckerberg has vowed to also build a "supreme court" with the power to overturn Facebook's rulings, saying he does not believe his company should be making so many decisions about free speech on its own.

In a blog post, Facebook's head of product policy research, Radha Iyengar Plumb, welcomed the DTAG's report, saying some of its recommendations had already been implemented.