Facebook is too lenient on those peddling hate speech | Jessica Valenti

There’s a quote from Maya Angelou that I’m fond of: “When people show you who they are, believe them.” It’s a sentiment I think about quite a lot when I’m combing through harassment or threats online. Because you do want to believe that the person who calls you a “bitch” or who says you should be raped isn’t really like this – that they’re actually good people having a bad day, or a bad life. You want to see the best in other people.

But the truth is that who we are is very much about how we treat others – whether it’s on the street, in our homes or, yes, on the internet. That’s why I was so concerned to see the broad latitude given to online abusers in Facebook’s guidelines for dealing with harassment and hate speech. Their baseline approach appears to give harassers the benefit of the doubt at every turn.

For example, it’s perfectly allowable for someone on Facebook to write: “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat,” because it’s not an example of “credible” violence that is a “call to action” – just a venting of frustration, they say.

Similarly, if someone were to send the message, “unless you stop bitching I’ll have to cut your tongue out,” it would be classified as an “aspirational” or “conditional” statement. So this direct threat would be permitted on the site.

Why would Facebook believe that this kind of abuse is not a real threat to people? Well, because it’s online. In one of the documents obtained by the Guardian, the social media giant notes, “people use violent language to express frustration online,” and that harassers actually feel “indifferent towards the person they are making the threats about because of the lack of empathy created by communication via devices as opposed to face to face”.

That’s quite a dangerous leap: the fact that someone might not threaten a person to their face the way they would online doesn’t make the threat, or the hate behind it, any less real.

Before Elliot Rodger went on a killing spree in Isla Vista, California, in 2014, he made YouTube videos in which he complained that women didn’t like him and said he would “slaughter every single spoiled, stuck-up, blond slut I see”. After George Sodini killed three women in 2009 by shooting up a women-only gym class, police found multiple misogynist postings online.

In the wake of Donald Trump’s election, hate crimes have spiked and college campuses are trying to deal with an influx of white supremacist activity. Many of these groups, and their leaders, developed their following online.

And just last week, a New Hampshire state lawmaker resigned after the Daily Beast published a story showing that Representative Robert Fisher was actually the creator of a violently misogynist Reddit community. Online, Fisher admitted taping sexual encounters with women without their knowledge, and argued in favor of taking away women’s autonomy and freedom.

People like this do not feel “indifferent” towards the targets of their ire simply because they’re on the internet. They feel hatred, they feel rage. We have no way of knowing – not really – who will end up being a real danger to people. But we don’t need to give them space and attention, and we certainly shouldn’t give them permission to spread their bile.