
Google directing people to extreme content and conspiracy theories, Sky News finds

Google is directing people towards misinformation and conspiracy theories by placing YouTube videos prominently in its search results, an investigation by Sky News has found.

Sky News found a number of search terms for which extreme content and conspiracy theories featured prominently in Google's results - including several topics of public concern. They appeared in the box marked "Videos", which Google places on the first page of certain searches, usually among the top results.

For instance, a search for the phrase "5G birds" - a term relating to a conspiracy theory that upgraded mobile networks will kill birds - showed an article by fact-checking organisation Snopes debunking the theory as the top result.

Immediately underneath, the Videos box presented three YouTube videos repeating the claim. The first, which came with a thumbnail picture, was "5G is killing birds What is it doing to us", a video from Life Energy Solutions, a New Zealand-based YouTube channel with 3,096 subscribers.

After Sky News alerted Google, which owns YouTube, to the search results, the Videos box no longer appeared for this search term or related terms.

Other search terms showed similar results. A search for "Maddy McCann", made while that term was trending in mid-March after the release of a Netflix documentary about the missing child, produced a Videos box in which the second item was "Five Creepy Facts About The Madeleine McCann Case," a video by a YouTube channel called Fully Sourced.

The video falsely claims that Kate McCann, Madeleine's mother, is a nymphomaniac, and links Madeleine's disappearance to John Podesta, Hillary Clinton's former campaign manager, and his brother Tony, a well-known conspiracy theory which has repeatedly been debunked. The video, which had 174,279 views in March, has now been watched over five million times.

As well as giving relatively small channels unusual prominence, Google's Videos box featured videos from popular YouTube channels, without apparent regard for balance. In a search for the phrase "yellow vests", the Videos box was the fourth result, after the Wikipedia page and the "Top Stories" box. It showed two videos from the same publisher: RT, the Russian government-backed television network formerly known as Russia Today.

Clicking on the arrow to the right of the box revealed three further videos, one from RT and two from Ruptly, a video news agency owned by RT.

A spokesperson for Google and YouTube told Sky News that deliberate misinformation online was "a major concern", which it had been battling by cutting off some sites' revenue and prioritising authoritative sources, especially on sensitive topics. Searches around controversial topics such as vaccines or chemtrails produced no Videos boxes on Google.

Sky News showed its findings to Will Moy, director of independent fact-checking charity Full Fact. "I'm concerned that when we look for information we ought to get the stuff that actually helps us make up our minds," he said.

"It's great that YouTube provides a platform for anyone to say what they want. It's not so great that they then amplify that to everyone who is casually searching. That seems to be a risk that they haven't fully understood."

Around two-thirds of the traffic to Full Fact's website comes from Google, highlighting the crucial role the search engine plays in directing people to information. "It is rightly often called the homepage of the internet," said Mr Moy.

Until recently, many more searches showed contentious or divisive results in the Videos box. A search for "Tommy Robinson" on 19 March, the day after the far-right activist's contempt of court case was delayed, produced a box with two videos from Mr Robinson's YouTube account and a third video from a Robinson-supporting channel called ACTIVE PATRIOT UK. The Videos box was the third item on the results page, after Mr Robinson's Wikipedia page and another box containing news articles from publishers.

In another search for "Tommy Robinson", this time on 31 May, the Videos box again ranked highly. It showed videos from the Guardian and Sky News, alongside a video from far-right channel Rebel Media. Clicking to bring up further videos revealed two more Robinson-supporting videos, including one from a channel which refers to Islam as a "poison".

A search for "Muslim immigration Britain" on 25 April showed four videos, three of which repeated far-right tropes (the fourth was from Channel 4). One was from far-right YouTuber You Kipper.

The Videos boxes for these searches disappeared after YouTube placed restrictions on Tommy Robinson's channel on 2 April, then purged numerous far-right channels on 5 June 2019.

Google places a number of boxes in its search results, including a "Shopping" box for commercial searches, a box containing "Related searches" and a box featuring answers to questions that "People also ask." The results they contain are not filtered through Google's normal search algorithm, which searches the entire web for relevant results, but come from separate, dedicated sources.

Although Google's search results are personalised, it is possible to see how the search engine appears to the average user by searching in incognito mode, which does not weight results by factors such as browsing history or location.

These findings raise further questions about the role of YouTube, the largest video site in the world, which has been accused of helping radicalise people by pushing them towards extreme or divisive content.

"The algorithm for the last 10 years has been pushing people down rabbit holes," says Guillaume Chaslot, a former YouTube software engineer who now runs a non-profit called Algo Transparency. "That is most efficient for watch times."

"Whether it's a terrorist rabbit hole or one that will make you believe a crazy conspiracy theories or make you watch crap all day, it doesn't matter," says Mr Chaslot, who describes YouTube as a "radicalisation engine".

YouTube strongly denies such claims. Its spokesperson told Sky News it no longer used watch time as a metric, that it was not in its ethical or financial interest to recommend harmful content, and that it had changed its closely guarded algorithm to reduce the impact of what it described as "borderline content" - content which does not break its rules but "comes right up to the line".