However extreme your views, you’re never hardcore enough for YouTube

John Naughton
Users looking for video information on the Chemnitz riots were led to extremist content. Photograph: Odd Andersen/AFP/Getty Images

Early one Sunday morning a month ago, a German carpenter was fatally stabbed in a street fight in Chemnitz in eastern Germany. Little is known about how the brawl started, but rumours rapidly circulated online that the man was defending a woman from sexual assault. Within hours of his death, rumours that his killers were two refugees triggered a violent reaction. For two nights running, thousands of rightwing extremists and sympathisers took to the streets of the city. Shocking videos of demonstrators openly using the Nazi salute (a criminal offence in Germany) and chasing and attacking people of foreign appearance rapidly appeared online.

The reverberations of the riots continue to roil German politics and society. They appear to have given a massive boost to the rightwing AfD party, for example, which according to some opinion polls is now in second place in Germany. And last week, Angela Merkel's government removed the head of the domestic intelligence agency, Hans-Georg Maaßen, from his post after he faced criticism for his reaction to the anti-immigrant protests: he had cast doubt on the authenticity of the videos showing dark-skinned people being chased and attacked.

What’s going on? How did so many Germans become so worked up about a street brawl? This question intrigued Ray Serrato, a Berlin-based digital researcher who first noticed it when his wife’s uncle showed him a YouTube video that claimed the rioters had been Muslim refugees. The video was cheaply produced and uploaded by an obscure fringe group, and yet it had had nearly half a million views – far more than any legitimate news video of the riots. How was this possible?

So Serrato started digging, looking for information on every Chemnitz-related video published on YouTube this year. What he found, according to a New York Times report, is that the platform’s recommendation system consistently directed people toward extremist videos on the riots — then on to far-right videos on other subjects. “Users searching for news on Chemnitz would be sent down a rabbit hole of misinformation and hate. And as interest in Chemnitz grew, it appears, YouTube funnelled many Germans to extremist pages, whose view-counts skyrocketed.”

Nobody who knows anything about YouTube will be surprised. Time and again, researchers have discovered that when videos with political or ideological content are uploaded to the platform, YouTube’s “recommender” algorithm will direct viewers to more extremist content after they have watched the first one. Given that autoplay is on by default and most people probably leave it that way, watching YouTube videos often leads people to extremist content.

Strangely, this doesn’t just hold for political or other types of controversial content. Zeynep Tufekci, a well-known technology commentator, found that videos about vegetarianism led to videos about veganism, videos about jogging led to videos about running ultramarathons, and so on. “It seems,” she wrote, “as if you are never ‘hardcore’ enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes.”

Given its billion or so users, she concluded that “YouTube may be one of the most powerful radicalising instruments of the 21st century.”

Is this because engineers working for Google (which owns YouTube) have a radicalising or extremist agenda? Unlikely; most software engineers I know are terminally uninterested in politics, which they regard as a hobby for underemployed people with low IQ scores.

The key factor is a fiendish combination of machine-learning software and Google’s business model. Google is an advertising company masquerading as a tech giant. It sells our attention to advertisers who are willing to pay for it. So the company’s overriding incentive is to keep people watching YouTube for as long as possible. And its recommendation algorithm is informed by research into human psychology, which suggests that people are drawn towards content that is more extreme or incendiary than the video they started with. QED.
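The dynamic is easy to see in miniature. The sketch below is a toy simulation, not YouTube's actual system: it assumes (purely for illustration) that viewers engage slightly longer with content a notch more intense than their current taste, and that a greedy recommender simply picks whatever it predicts will be watched longest. Under those two assumptions, the recommendations ratchet steadily towards the extreme end of the catalogue.

```python
# Toy model of a watch-time-maximising recommender (illustrative only).
# Assumption: engagement peaks just above the viewer's current taste.
import math

def expected_watch_time(intensity, taste, pull=0.08):
    # Predicted engagement is highest for content slightly more
    # intense than what the viewer currently prefers.
    return math.exp(-((intensity - (taste + pull)) ** 2) / 0.01)

def recommend(taste, catalogue):
    # Greedily pick the video predicted to be watched longest.
    return max(catalogue, key=lambda i: expected_watch_time(i, taste))

catalogue = [i / 100 for i in range(101)]  # intensities 0.00 .. 1.00
taste = 0.20                               # a fairly moderate viewer
history = [taste]
for _ in range(15):
    pick = recommend(taste, catalogue)
    taste = 0.5 * taste + 0.5 * pick       # watching shifts the viewer's taste
    history.append(round(taste, 2))

print(history)  # the viewer's taste drifts steadily towards 1.0
```

No one designed this loop to radicalise anyone; escalation falls out of optimising a single metric (predicted watch time) combined with a feedback effect on the viewer.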

One of the ironies of the current controversy about Facebook’s malign influence on democracy is that it has diverted attention from Google’s equally destructive properties. Yet one of the things we learned from research into the alt-right infrastructure after Trump’s election is that YouTube plays an absolutely pivotal role in that ecosystem. So if governments really wanted to do something about online extremism (a big “if”, IMHO), then they could offer Google a straightforward deal: change the recommendation algorithm to deprioritise extreme content or we’ll do it for you.

What I’m reading

John Naughton’s recommendations

Self-stealing cars
As reported in the Register, security researchers showed how easy it was to clone the Tesla Model S key fob. Apparently, this hole in the car's defences has now been patched.

Hard-hearted Apple
Last week, Apple made a big deal of putting an ECG monitor into the Apple Watch Series 4, but as Business Insider reported, it strangely omitted to mention that another company has for some time been selling a strap for existing Apple Watches that does the same thing.

More for your megapixel
Apple has more than 800 engineers working on improving the iPhone camera. TechCrunch asked a professional photographer to explain what they’ve been doing.