
Facebook believes it is censoring people exactly the right amount

The company has also admitted that it will make mistakes: Press Association

Facebook has shed more light on how it moderates content on the site, amid heavy criticism.

The company believes it has found the right balance between how much unacceptable content it takes down and how much it leaves up in order to draw attention to serious issues.

“We face criticism from people who want more censorship and people who want less,” wrote Monika Bickert, Facebook’s head of global policy management, in a blog post today.

“We see that as a useful signal that we are not leaning too far in any one direction.”

Internal documents leaked over the weekend show that moderators are instructed to delete certain threats but leave others up, depending on how “credible” they are. The guidelines were widely criticised as confusing and counterintuitive.

For instance, “Someone shoot Trump” is considered to be a form of abuse that should be deleted straight away.

“Kick a person with red hair” and “I hope someone kills you”, however, can be left up.

“There’s a big difference between general expressions of anger and specific calls for a named individual to be harmed, so we allow the former but don’t permit the latter,” explains Ms Bickert.
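The distinction Ms Bickert describes can be pictured, very roughly, as a check on whether a violent statement is aimed at a named, protected individual. The short Python sketch below is purely illustrative: the PROTECTED_NAMES set, the keyword list and the helper functions are hypothetical stand-ins, not Facebook’s actual moderation logic.

```python
# Illustrative sketch of the "general anger vs. specific threat" distinction
# described above. All names and logic here are hypothetical assumptions,
# not Facebook's real moderation system.

# Hypothetical set of named individuals the policy would protect.
PROTECTED_NAMES = {"trump"}

# Hypothetical keywords indicating a call for violence.
HARM_WORDS = ("shoot", "kill", "stab")


def is_credible_threat(post: str) -> bool:
    """Return True if the post calls for harm against a named individual."""
    text = post.lower()
    calls_for_harm = any(word in text for word in HARM_WORDS)
    names_a_target = any(name in text for name in PROTECTED_NAMES)
    return calls_for_harm and names_a_target


def moderate(post: str) -> str:
    """Delete specific calls for violence; leave general expressions of anger up."""
    return "delete" if is_credible_threat(post) else "leave up"


if __name__ == "__main__":
    print(moderate("Someone shoot Trump"))          # delete
    print(moderate("I hope someone kills you"))     # leave up (no named target)
    print(moderate("Kick a person with red hair"))  # leave up
```

Run against the article’s own examples, the sketch reproduces the stated outcomes, though the real policy plainly involves far more context and human judgement than a keyword check.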

“It’s hard to judge the intent behind one post, or the risk implied in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help?”

However, she also admits that the site will make mistakes.

“Being as objective as possible is the only way we can be consistent across the world. But we still sometimes end up making the wrong call.”

Facebook is recruiting 3,000 additional members of staff to help moderate the site, but critics are sceptical about how much of a difference they will make.

“Facebook cannot keep control of its content,” a source told The Guardian. “It has grown too big, too quickly.”

The company also uses algorithms to help its staff spot unacceptable content, but they aren’t watertight.

“We aim to keep our site safe,” says Ms Bickert. “We don’t always share the details of our policies, because we don’t want to encourage people to find workarounds – but we do publish our Community Standards, which set out what is and isn’t allowed on Facebook, and why.

“Our standards change over time. We are in constant dialogue with experts and local organizations, on everything from child safety to terrorism to human rights. Sometimes this means our policies can seem counterintuitive. As the Guardian reported, experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterwards to prevent copycats. When a girl in Georgia, USA, attempted suicide on Facebook Live two weeks ago, her friends were able to notify police, who managed to reach her in time.

“We try hard to stay objective. The cases we review aren’t the easy ones: they are often in a grey area where people disagree. Art and pornography aren’t always easily distinguished, but we’ve found that digitally generated images of nudity are more likely to be pornographic than handmade ones, so our policy reflects that.”