Instagram: Tests showing sexual videos recommended to accounts for teens don’t match ‘reality’

Instagram’s parent company dismissed a report Thursday finding that the social media platform regularly recommends sexual videos to accounts for teenagers, arguing that it doesn’t reflect “reality.”

Testing conducted by The Wall Street Journal and an academic researcher over seven months found that accounts for users listed as 13 years old were almost immediately served racy content on Instagram Reels.

When the teen accounts showed interest in racy videos, Instagram recommended even edgier content, including videos from adult sex-content creators.

Internal testing and an analysis previously conducted by Meta, the parent company of Instagram and Facebook, produced similar results, according to the Journal.

Meta spokesperson Andy Stone said in a statement that the testing conducted by the Journal was an “artificial experiment” and suggested that it “doesn’t match the reality of how teens use Instagram.”

“We’re committed to constantly improving and have dedicated teams focused on helping ensure teens see age-appropriate content on Instagram, including when they first join the platform,” Stone said.

“As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months,” he added.

Meta announced in January that it was shifting its approach to teen accounts, automatically placing them on the most restrictive content control settings and hiding age-inappropriate content.
