Social media platforms have a content problem.
This isn't about the fact that last year they became (hopefully) unwitting shills for foreign governments trying to sway the outcome of our national elections.
That's definitely a major (and potentially existential) threat to social media businesses, but so is a more lingering, and equally pervasive, problem that websites like Facebook, Instagram, and Twitter have with violent and pornographic imagery popping up in their feeds.
Three years ago in Wired, Adrian Chen wrote a great piece about the poor souls who work for these billion-dollar companies out of linoleum-tiled offices in the Philippines -- far from the glittering campuses of Silicon Valley or the high tech office towers in San Francisco -- whose job it is to keep dick pics and beheadings from reaching the screen of your preferred internet consumption device.
Well, it's not just an American problem.
In China, where content laws are far more restrictive and penalties for defying government proscriptions on content are far more severe, new social media giants and telecom providers like Musical.ly, Live.me, China Mobile Communications, Thunder, and China Unicom have moved away from employing human censors to deploying a technology from the young Guangzhou-based startup Tuputech.
Launched just three years ago by one of the co-founders of China's wildly successful WeChat messenger service, Tuputech has already amassed a group of customers in China whose combined reach is larger than the entire population of the U.S.
“Interacting with a brand or social site online should be enjoyable. We shouldn’t have to worry about offensive content popping up,” said Leonard Mingqiang, founder of Tuputech (and a former founding member of WeChat). “Our technology acts fast to detect inappropriate content and help our clients remove it before it enters the online environment.”
It's not surprising that any social media company would look to automate as much of the process as possible -- indeed, Facebook and others already use their own content-moderation tools in addition to the human censors who regulate newsfeeds -- but in China, where the government is far more restrictive of free speech, tools like Tuputech's are a necessity.
The company boasts some staggering numbers. Its software analyzes 900 million images daily, processing 50 images per second and identifying pornographic or violent images with a 99.5% accuracy rate.
It's that 0.5% that can be problematic for Western companies. In China, little more than lip service is paid to the right to free speech. In reality, there are no protections for it, so the companies that censor their communities face no reprisal or censure from the users they're censoring.
Companies in the U.S., however, have to deal with users who are far more concerned about the limits that the services they use place on the types of posts they can make.
Facebook users like Didi Delgado, a poet and activist, have explicitly taken the world's largest social network to task for its censorship of certain communities. And the company faced tremendous criticism when ProPublica revealed how the company's rules protect white men.
The purported benefit of a service like Tuputech would be its ability to "learn" with each image it processes. So far, the company has categorized over 100 billion images for its Chinese clients, and it would now like to hoover up data on restricted content from any U.S.-based company that would care to use it.
To use Tuputech, customers link to the company's software using an API. Images are processed on Tuputech's servers and categorized. If an image is clearly objectionable then Tuputech will flag it.
Companies using the service send discrete images along as they're uploaded to the site, or take random screen grabs at five-second intervals in the case of video.
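The client-side flow described above can be sketched in a few lines. To be clear, this is an illustrative mock-up, not Tuputech's actual API: the function names and request shape are invented, and the "random grabs every five seconds" behavior is simplified to fixed-interval sampling.

```python
# Hypothetical sketch of the moderation workflow described above.
# Nothing here reflects Tuputech's real API; names and payload
# shapes are assumptions for illustration only.

def sample_timestamps(duration_s: float, interval_s: float = 5.0) -> list:
    """Return the timestamps (in seconds) at which a client would grab
    frames from a video of the given duration, one every interval_s
    seconds (a simplification of the "random grabs" the article
    describes)."""
    if duration_s <= 0:
        return []
    timestamps = []
    t = 0.0
    while t < duration_s:
        timestamps.append(t)
        t += interval_s
    return timestamps


def build_review_request(image_urls: list) -> dict:
    """Package a batch of uploaded images for a moderation API call.
    The payload shape is illustrative, not a documented format."""
    return {"images": [{"url": u} for u in image_urls]}


# A 12-second clip yields grabs at 0s, 5s, and 10s.
print(sample_timestamps(12))  # [0.0, 5.0, 10.0]
print(build_review_request(["https://example.com/a.jpg"]))
```

In a real integration, each sampled frame or uploaded image would be sent to the provider's servers for classification, with flagged results returned to the client.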
Tuputech also has a second, manual phase: it will send a picture back to a client for human review if the content is what Leonard described as borderline.
"We have a second phase of manual addition to decide whether the nudity is prohibited or acceptable," Leonard said through a translator. "If it is only a little bit naked. That [acceptability] would be decided in the manual additional phase."
There are some impressive researchers who are working with Leonard on the technology, including Xiao Fei He, the former head of the research institute established by the ride-sharing service Didi.
In China, some companies are relying almost exclusively on Tuputech and have reduced their auditing manpower by as much as 90%. In a release, the company cited its use at Musical.ly as an example. There, the company has eliminated 95% of the human labor required to manage uploaded images and videos, and today only 1% of its videos require manual review.
“Our goal is to remove some of the stress and pressure that comes with the day-to-day operations of running a massive online business,” said Leonard in a statement. “We take online image and content review off your plate. A company’s manual review becomes 1% of their time so they can focus on more important things and keep moving their company forward.”
Tuputech has raised $10 million in financing from Chinese investors like Northern Light Venture Capital and Morningside Venture Capital. Its last round closed in April 2016.