Facebook gives special protections to racist pages and allows extreme content to be shared, investigation shows

Facebook gives special protections to Tommy Robinson and allows people to racially abuse immigrants, according to a new report.

Graphic images and videos of children, violent hate speech and racist content are not immediately or automatically removed from the site, according to footage taken by Channel 4’s Dispatches.

An undercover reporter filmed the people who review reported content and decide whether it should remain visible to the public, gaining an unprecedented insight into what is allowed to be posted on the platform.

And it shows that a variety of content remains on the site even after it is reported – not only hate speech, but videos depicting child abuse, violence and self-harm, even when the person shown is suspected of being a child.

It also shows that far-right and racist content is given special protections that stop it being deleted quite so easily. Trainees are shown being told that content that racially abused protected ethnic or religious groups would be removed – but if that abuse is limited to immigrants from those groups, the posts would stay up.

In the footage, moderators are shown explaining that a post targeting Muslims with racist language would be removed, for instance. But if the posts specifically targeted Muslim immigrants, then that could be allowed to stay up because it is a political statement, Facebook has suggested.

It also shows that hate speech and violent content that is slightly more abstract will be allowed. Trainee moderators are shown a cartoon that seems to depict a child being drowned because she is attracted to a “negro” and told that it would probably not be removed from the site, though Facebook has since said that such content violates its hate speech standards.

The findings come as Facebook attempts to clean up its image in the wake of a variety of scandals. As well as the Cambridge Analytica controversy – in which Facebook gave up user data that was then analysed for targeted political advertising – the site has committed to improve the way it screens what content is available on its platform.

The new footage shows the people doing that work, inside Facebook’s largest centre for UK content moderation. Much of that work has been outsourced to other companies, and the Dispatches report centres on one of those external companies.

It shows the training process that those moderators go through. They are shown how to check videos that are reported using Facebook’s tools – which allow users to flag videos that depict graphic violence, child abuse, or other extreme content.

Facebook aims to review all of those reports within 24 hours. But the footage shows that target is often not achieved, with sensitive content such as suicide threats or self-harm potentially going unchecked for days.

Much of the content reported on Facebook is left online even if it depicts violence or other extreme behaviour. And for the most part those reports are not passed on to the police, unless the video is streamed live, Dispatches claimed.

Facebook pages including ones devoted to Tommy Robinson and Britain First were given special protections usually afforded to governments and news organisations. That “shielded” status means that problem content won’t simply be deleted, but will be forwarded on to full-time Facebook employees who will review the content.

Those pages have vast numbers of followers – Mr Robinson’s page has over 900,000 fans – and moderators speculate that they are left up because “they have a lot of followers so they’re generating a lot of revenue for Facebook”.

Facebook admitted that those special protections are in place on Tommy Robinson’s page, and that they had been for Britain First before it was taken down. But it said that the decision was made to ensure that the site did not wrongly take down legitimate political content.

“If the content is indeed violating it will go. I want to be clear this is not a discussion about money, this is a discussion about political speech,” said Richard Allan, Facebook’s vice president of global policy solutions. “People are debating very sensitive issues on Facebook, including issues like immigration. And that political debate can be entirely legitimate.

“I do think having extra reviewers on that when the debate is taking place absolutely makes sense and I think people would expect us to be careful and cautious before we take down their political speech.”

In the documentary, early Facebook investor Roger McNamee claims that the company relies on that extreme content to ensure that its users stay engaged with the site.

“It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform,” he said. “Facebook understood that it was desirable to have people spend more time on site. If you’re going to have an advertising-based business, you need them to see the ads, so you want them to spend more time on the site.

“Facebook has learned that the people on the extremes are the really valuable ones because one person on either extreme can often provoke 50 or 100 other people and so they want as much extreme content as they can get.”

Those accusations were backed by senior politicians. “These revelations about Facebook’s content moderation are alarming, but not surprising. Our Committee have been pressing Facebook for months for information during our fake news inquiry on how it deals with extreme content and stamps out fake news, but they don’t behave like a company who welcome independent scrutiny,” said Julian Knight MP, member of the Digital, Culture, Media and Sport Committee.

“Facebook has recently committed to reducing fake news and improving privacy on its platform, which is welcome. But they don’t seem as committed to sacrificing profits made from extreme content as is demonstrated by Channel 4’s investigation.”

Mr Allan denied accusations that the company relies on such extreme content for its revenues.

“Shocking content does not make us more money, that’s just a misunderstanding of how the system works,” he told Dispatches. “People come to Facebook for a safe, secure experience to share content with their family and friends.”

The full documentary – Inside Facebook: Secrets of the Social Network – will be shown on Channel 4 at 9pm on 17 July.

Facebook admitted that the footage showed behaviour that fell short of its standards. It said it has made a number of changes since being made aware of the allegations to ensure the problems are fixed.

“It’s clear that some of what is shown in the programme does not reflect Facebook’s policies or values, and falls short of the high standards we expect,” said Richard Allan, Facebook’s vice president of global policy solutions, in a statement provided to The Independent.

“We take these mistakes in some of our training processes and enforcement incredibly seriously and are grateful to the journalists who brought them to our attention. Where we know we have made mistakes, we have taken action immediately. We are providing additional training and are working to understand exactly what happened so we can rectify it.”