
Facebook blames pandemic for failure to remove millions of child abuse images from its platforms

The social network revealed it took action on significantly less material of this kind between April and June, saying it had fewer working moderators because of the pandemic - Richard Drew/AP

Facebook has blamed the coronavirus pandemic for its failure to remove millions of child abuse and self-harm images from its platforms.

The social network revealed it took action on significantly less material containing such content between April and June, saying it had fewer working moderators because of the virus crisis.

Facebook's latest community standards report shows action was taken on 911,000 pieces of content related to suicide and self-injury in the three-month period, compared with 1.7 million pieces in the previous quarter.

Meanwhile on Instagram, action was taken against 275,000 such posts, down from 1.3 million in the previous quarter.

Action on media featuring child nudity and sexual exploitation also fell on Instagram, from one million posts to 479,400.

Facebook estimates that less than 0.05 per cent of views were of content that violated its standards against suicide and self-injury.

"Today's report shows the impact of Covid-19 on our content moderation and demonstrates that, while our technology for identifying and removing violating content is improving, there will continue to be areas where we rely on people to both review content and train our technology," the company said.

"With fewer content reviewers, we took action on fewer pieces of content on both Facebook and Instagram for suicide and self-injury, and child nudity and sexual exploitation on Instagram.

"Despite these decreases, we prioritised and took action on the most harmful content within these categories. Our focus remains on finding and removing this content while increasing reviewer capacity as quickly and as safely as possible."

Children's charity the NSPCC said Facebook's "inability to act against harmful content on their platforms is inexcusable".