Internet firms must do more to tackle online extremism, says No 10


Downing Street has called for social media companies to do more to expunge extremist material from the internet.

The prime minister’s spokesman said firms such as Facebook and Google “can and must do more” to remove inflammatory material from the web and that it was up to them to respond to public concern.

“Social media companies have a responsibility when it comes to making sure this material is not disseminated and we have been clear repeatedly that we think that they can and must do more,” the spokesman told journalists.

“We are always talking with them on how to achieve that. The ball is now in their court. We will see how they respond.”

The London terror attack has reignited concern about the easy availability of material promoting violent extremism online, although No 10 said on Friday it was making a general point and was not necessarily saying online material was a factor in the radicalisation of Khalid Masood, the terrorist who mowed down pedestrians on Westminster Bridge before stabbing a police officer guarding the gates of parliament.

The prime minister’s spokesman would not be drawn on whether the government would legislate if social media companies failed to tighten their procedures, or on what a new law might involve. He said the priority was to stop radical material appearing online in the first place, and that “when it does appear, we want it to be taken down as quickly as possible”.

The fight against terrorism and hate speech had to be a joint one, he said. “The government and security services are doing everything they can and it is clear that social media companies can and must do more.”

Speaking in New York this week, Boris Johnson, the foreign secretary, said extremist material online was “corrupting and polluting” many people.

“I do think the responsibility for this most lies with the internet providers, with those that are responsible for great social media companies. They have got to look at the stuff that is going up on their sites. They have got to take steps to invigilate it and to take it down where they can,” Johnson said.

The House of Commons home affairs committee is considering this issue as part of its inquiry into hate crime. After a recent hearing with Google, Facebook and Twitter, Yvette Cooper, the committee’s chair, offered a withering assessment of their record tackling extremist material.

She said YouTube’s enforcement of its community standards was “a joke” and that Twitter and Facebook were too slow to deal with hate-filled content. She said: “These are incredibly powerful organisations. They are able to analyse algorithms and behaviour in a sophisticated way in order to target potential consumers with adverts. It’s time they used more of that power, money and technology to deal with hate crime and to keep people safe.”

Johnson also called for a debate about whether internet companies should publish pictures and video of terrorist attacks as they are taking place.

Asked whether Downing Street backed this idea, the prime minister’s spokesman said on Friday that what was important was that “when incidents like this happen, people’s first thought should be getting any footage that can be helpful to the police”.

In a statement, Facebook told the Guardian: “There is absolutely no place for terrorist groups on Facebook and we do not allow content that promotes terrorism on our platform.

“Whenever we are made aware of this kind of content, we take swift action to remove it from Facebook and work with law enforcement and security agencies as appropriate. We take this responsibility very seriously and continue to work with the government to explore what more can be done to tackle extremism online.”

Twitter and Google declined to comment on No 10’s remarks.