Facebook responded to the leak of its moderator rules: 'We get things wrong' (FB)

Mark Zuckerberg, Facebook's CEO. Justin Sullivan/Getty Images

  • Facebook's moderation rules have leaked, revealing what is and isn't allowed on the social network.

  • A Facebook exec has written a column defending the company, admitting: "We get things wrong."

Facebook's moderation policies have been in the spotlight this week, after The Guardian published leaked documents detailing how the social network decides what is and isn't acceptable on its platform.

The statement "to snap a b---h's neck make sure to apply all your pressure to the middle of her throat" can be permissible, for example. "Someone shoot Trump," on the other hand? Not allowed.

The company's head of global policy management, Monika Bickert, has now responded to the leaks with a lengthy column defending its practices — also published in The Guardian.

From the outset, Bickert strikes a conciliatory tone, lauding The Guardian's coverage (its "reporting on how Facebook deals with difficult issues/images such as this gets a lot of things right") and defending specific practices for which Facebook has been criticised following the reports.

For example, the social network does not take down livestreaming videos of people attempting to self-harm. This is because, she says, "experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterwards to prevent copycats ... When a girl in Georgia, USA, attempted suicide on Facebook Live two weeks ago, her friends were able to notify police, who managed to reach her in time. We are aware of at least another half-dozen cases like this from the past few months."

Nick Ut's Pulitzer-winning Vietnam War photo. AP Photo/Nick Ut

She also defends not publishing Facebook's moderation policies in detail, as some (including this author) have called for, saying the company doesn't "want to encourage people to find workarounds."

Facebook's standards for moderation have previously attracted heavy criticism. In 2016, it censored the iconic Vietnam War photo "The Terror of War," and censured Aftenposten, Norway's biggest newspaper, for publishing it. It has also banned a photo of a Renaissance-era Italian statue for being "sexually explicit," and suspended users who posted a photo of Aboriginal women in traditional dress, among other examples.

Bickert acknowledges that mistakes have been made before: "We get things wrong, and we’re constantly working to make sure that happens less often. We put a lot of detailed thought into trying to find right answers, even when there aren’t any.

"I hope that readers will understand that we take our role extremely seriously."

Earlier this month, Facebook announced it is hiring 3,000 extra reviewers, a fact Bickert reiterates in her column. The hires come after a spate of murders, accidental deaths, and suicides were streamed on Facebook Live, its live video feature.

She signs off by arguing that Facebook and broader society are still trying to work out what is "acceptable," but that Facebook is doing the best it can: "Technology has given more people more power to communicate more widely than ever before. We believe the benefits of sharing far outweigh the risks. But we also recognise that society is still figuring out what is acceptable and what is harmful, and that we, at Facebook, can play an important part of that conversation."

SEE ALSO: Facebook's leaked moderation rules show the company desperately needs to be more transparent