Fine Social Media Companies Over Child Pornography, Extremist Material: British Lawmakers

Josh Lowe

Social media companies such as YouTube should be fined if they fail to remove illegal content, such as child pornography, from their platforms, an influential committee of British MPs has said.

“Social media companies currently face almost no penalties for failing to remove illegal content,” a report published Monday by the House of Commons Home Affairs Select Committee said.

“We recommend that the government consult on a system of escalating sanctions to include meaningful fines for social media companies which fail to remove illegal content within a strict timeframe,” it continued.

The report slammed companies for “outsourcing” some of the monitoring of their platforms to taxpayer-funded bodies: “In the U.K., the Metropolitan Police’s Counter Terrorism Internet Referral Unit (CTIRU) monitors social media companies for terrorist material,” the report said.

“That means that multi-billion pound companies like Google, Facebook and Twitter are expecting the taxpayer to bear the costs of keeping their platforms and brand reputations clean of extremism,” it added.

The report also tackled the failure of social media companies to remove terrorist and extremist material.

“The weakness and delays in Google’s response to our reports of illegal neo-Nazi propaganda on YouTube were dreadful,” it said. “Despite us consistently reporting the presence of videos promoting National Action, a proscribed far-right group, examples of this material can still be found simply by searching for the name of that organisation.”

“We recommend that all social media companies introduce clear and well-funded arrangements for proactively identifying and removing illegal content—particularly dangerous terrorist content or material related to online child abuse,” the report concluded.

A YouTube spokesperson told Newsweek via email: “We take this issue very seriously. We’ve recently tightened our advertising policies and enforcement; made algorithmic updates; and are expanding our partnerships with specialist organisations working in this field. We’ll continue to work hard to tackle these challenging and complex problems.”

Nick Pickles, Twitter's U.K. Head of Public Policy, said via email: “Our Rules clearly stipulate that we do not tolerate hateful conduct and abuse on Twitter. As well as taking action on accounts when they're reported to us by users, we've significantly expanded the scale of our efforts across a number of key areas.

“From introducing a range of brand new tools to combat abuse, to expanding and retraining our support teams, we're moving at pace and tracking our progress in real-time. We're also investing heavily in our technology in order to remove accounts who deliberately misuse our platform for the sole purpose of abusing or harassing others. It's important to note this is an ongoing process as we listen to the direct feedback of our users and move quickly in the pursuit of our mission to improve Twitter for everyone,” he continued.

Newsweek has contacted Facebook for comment.

The inquiry into hate speech online was prompted by the murder of Jo Cox, the Labour Party MP killed by a far-right extremist called Thomas Mair during the Brexit campaign.

The committee also heard evidence relating to Islamophobia, misogyny, far-right extremism, the role of social media in hate crime and the particular issues faced by members of parliament in relation to hate crime and its violent manifestations.

The report did note some positive behavior by social networks: “We welcome the fact that YouTube, Facebook and Twitter all have clear community standards that go beyond the requirements of the law,” it said.

“We strongly welcome the commitment that all three social media companies have to removing hate speech or graphically violent content, and their acceptance of their social responsibility towards their users and towards wider communities.”
