Facebook struggles to prevent violence on Facebook Live

Mark Zuckerberg has repeatedly said that live video is the future of Facebook, but what if that future is terrifying and full of violence?

What happens when one of the largest proponents of live video struggles to manage its darker side?

Reports that the sexual assault of a 15-year-old girl was broadcast on Facebook Live and watched by upwards of 40 people earlier this week have rightfully shocked many, and brought to mind a similarly disturbing incident from earlier in the year. 

Posts depicting acts of violence are nothing new on the social media platform, but since the launch of Facebook Live, the company has faced a particularly difficult challenge: how best to respond to violence on the site when it's happening in real time. 

And if Zuckerberg is correct in his predictions, the scale of the problem is only going to get worse. 

"Most of the content 10 years ago was text, and then photos, and now it’s quickly becoming videos," he noted at the 2016 Mobile World Congress. "I just think that we’re going to be in a world a few years from now where the vast majority of the content that people consume online will be video."

Is Facebook doomed to play catch-up?

With live video charging ahead, how can Facebook identify and stop those who would abuse its streaming service?

Mashable reached out to Facebook directly about this week's sexual assault and its plan to prevent people from livestreaming acts of violence in the future. 

The company's response reiterated its established position on the matter.

“This is a hideous crime and we do not allow this kind of content on Facebook," wrote a Facebook spokesperson. "We take our responsibility to keep people safe on Facebook very seriously and will remove videos that depict sexual assault and are shared to glorify violence.”

Videos aren't the only things Facebook can remove; individuals can also be banned from the service for posting violent videos that violate its "Community Standards."

This isn't the first time Facebook has made that point, as past incidents have forced the social media giant to detail how it handles violent streaming content. 

"We have a team on-call 24 hours a day, seven days a week, dedicated to responding to these reports immediately," the company stated in a press release from July of last year. "The rules for live video are the same for all the rest of our content. A reviewer can interrupt a live stream if there is a violation of our Community Standards."

Essentially, Facebook relies on people seeing a troubling stream and reporting it to the company — "it only takes one report for something to be reviewed," the release continued. 

Once an offending video has been identified as violating the company's community standards, it can be removed. However, while removal can happen during the initial stream, it may also come only well after the video has gained notoriety and been viewed by many people.
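
To make the mechanics concrete, here is a minimal sketch in Python of the report-then-review flow the release describes. The class and method names are hypothetical, invented for illustration; this is not Facebook's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class LiveStream:
    stream_id: str
    live: bool = True
    reports: list = field(default_factory=list)

class ReviewQueue:
    """Hypothetical model of report-driven moderation."""

    def __init__(self):
        self.pending = []

    def report(self, stream: LiveStream, reason: str):
        stream.reports.append(reason)
        # "It only takes one report for something to be reviewed."
        if len(stream.reports) == 1:
            self.pending.append(stream)

    def review(self, violates_standards) -> list:
        """Human reviewers check each flagged stream against the standards."""
        removed = []
        for stream in self.pending:
            if violates_standards(stream):
                # A reviewer can interrupt a live stream mid-broadcast.
                stream.live = False
                removed.append(stream.stream_id)
        self.pending.clear()
        return removed

# Example: a single user report queues the stream for human review.
queue = ReviewQueue()
stream = LiveStream("live_123")
queue.report(stream, "depicts violence")
print(queue.review(lambda s: True))  # ['live_123']; stream.live is now False
```

The key property, as the release describes it, is that review is gated entirely on user reports: a stream nobody flags never reaches the on-call team at all.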

That's clearly a problem. It's estimated that hundreds of hours of video are uploaded to YouTube every minute. If Facebook Live reaches the ubiquity Zuckerberg clearly hopes for, an internal Facebook team would run into significant obstacles in properly vetting a comparable flood of flagged content. 

Moving forward

Facebook is in a tough spot. The company doesn't want its product used to promote violence, but clearly can't keep every single incident from slipping through the cracks. 

One imagines that a feature like Google's Cloud Video Intelligence API, which lets developers search a collection of videos for specific objects and scenes, could at some point be adapted to screen videos for violence. That would certainly help ease the burden on Facebook's on-call team.
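
For illustration, here's a rough sketch of what such screening might look like using Google's Cloud Video Intelligence Python client. The bucket path and the watchlist of labels are hypothetical, and generic label detection is only a crude stand-in for a purpose-built violence classifier.

```python
# pip install google-cloud-videointelligence
from google.cloud import videointelligence

# Hypothetical flagged upload sitting in a Cloud Storage bucket.
INPUT_URI = "gs://example-bucket/flagged-video.mp4"

client = videointelligence.VideoIntelligenceServiceClient()
operation = client.annotate_video(
    request={
        "input_uri": INPUT_URI,
        "features": [videointelligence.Feature.LABEL_DETECTION],
    }
)
result = operation.result(timeout=300)  # annotation runs asynchronously

# Surface any labels a human reviewer might want to triage first.
WATCHLIST = {"weapon", "firearm", "fight"}  # illustrative, not exhaustive
for label in result.annotation_results[0].segment_label_annotations:
    description = label.entity.description.lower()
    if description in WATCHLIST:
        confidence = label.segments[0].confidence
        print(f"possible match: {description} ({confidence:.0%} confidence)")
```

Even then, a flag like this could only prioritize a video for human review, not remove it automatically, because a label such as "firearm" says nothing about why the footage exists.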

But even if that magical tech solution swoops in to save the day, computers have a hard time with context. Take the traumatic aftermath of the police shooting of Philando Castile: Facebook likely determined that the video attempted to call attention to violence rather than glorify it, and as such it remained on the site. 

With no easy monitoring solution in sight, and Zuckerberg's ambition to grow Live so fundamental to the company, violent videos will likely keep showing up on Facebook. What that means for the service and its 1.23 billion users is unclear, but it does suggest that the future of Facebook could be darker than its founder hoped. 
