'It's alarming': Leaked Facebook guidelines allow livestreaming suicide videos but ban nudity

Facebook’s publication guidelines on violence, hate speech, terrorism, pornography and suicide have been revealed for the first time.

The information was leaked to The Guardian newspaper, which has seen more than 100 internal training documents guiding Facebook moderators on what the site’s 2 billion users can and can’t publish.

The dossier reveals that Facebook will allow users to livestream suicide attempts. Facebook said it ‘doesn’t want to censor or punish people in distress who are attempting suicide’, according to the documents.

However, the footage will be taken down ‘once there’s no longer an opportunity to help the person’, the social media giant said.


The Guardian’s ‘Facebook Files’ are the first look at the policies and rules which define what is published on the social network.

Sources told the newspaper that Facebook moderators are overworked and confused about the policies – they often have ‘just ten seconds’ to make a publication decision.

One source told the newspaper: ‘Facebook cannot keep control of its content. It has grown too big, too quickly.’

Facebook reviews more than 6.5 million reports relating to potentially fake accounts each week, the dossier said.

Among the guidelines issued to moderators are the following:

– A post such as ‘Someone shoot Trump’ should be deleted, because the US president is a head of state in a protected category. However, a user may post, ‘To snap a bitch’s neck, make sure to apply all the pressure to the middle of her throat’, or ‘I hope someone kills you’, because they are not deemed credible threats.

– Videos of violent deaths don’t always have to be deleted because they raise awareness of issues such as mental illness.

– Photos of physical abuse and bullying may not have to be taken down unless they are deemed sadistic.

– Photos of animal abuse may be shared and will only be marked ‘disturbing’ if extremely upsetting.

– Videos of abortion are permitted as long as there is no nudity.

– ‘Handmade’ art showing nudity and sexual activity is allowed but digitally made art showing sexual activity is not.

– Users with more than 100,000 followers are deemed public figures, meaning they lack the same protections given to private individuals.

Facebook’s publication guidelines have been revealed (Picture: Rex)

The Guardian said concern about self-harm and suicide had increased in recent months. In one two-week period this year, there had been 5,431 reports of self-harm on Facebook.

In a policy document for moderators, Facebook said: ‘We’re now seeing more video content – including suicides – shared on Facebook.

‘We don’t want to censor or punish people in distress who are attempting suicide. Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers.’

‘However, because of the contagion risk, what’s best for the safety of people watching these videos is for us to remove them once there’s no longer an opportunity to help the person.’

Earlier this month, Facebook chief executive Mark Zuckerberg announced that the company would recruit another 3,000 moderators worldwide, adding to its existing moderation team of 4,500.

Children’s charity the NSPCC described the contents of the Facebook leak as ‘alarming to say the least’.

A spokesman said: ‘It [Facebook] needs to do more than hire an extra 3,000 moderators.

‘Facebook, and other social media companies, need to be independently regulated and fined when they fail to keep children safe.’

Facebook founder Mark Zuckerberg announced 3,000 new moderators this month (Picture: Rex)

Conservative MP Tim Loughton said: ‘This appears to show that Facebook’s control over content on its platform is in complete chaos.

‘Clearly, they need to clarify the rules of what is “in” or what is “out” and make sure they have sufficient moderator capacity to implement it transparently and fairly.’

Monika Bickert, head of global policy management at Facebook, said: ‘Keeping people on Facebook safe is the most important thing we do.

‘In addition to investing in more people, we’re also building better tools to keep our community safe.

‘We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.’