Google has publicly outlined plans to combat the spread of extremist and terror-related content on its platforms.
The pledge was published in an op-ed for the Financial Times written by Google's senior vice-president and general counsel, Kent Walker.
His message comes in the wake of a string of high-profile terrorist attacks in London and Manchester.
Walker's op-ed lists several steps that Google will take to tackle the spread of extremist messages and terrorist-related videos.
Firstly, the company will dedicate more resources to machine-learning tools that build faster 'content classifiers' for identifying terrorist or extremist content.
Google will also significantly increase the number of independent experts participating in YouTube's Trusted Flagger programme, who help to manually identify such content.
It also intends to take a "tougher stance" on videos that clearly violate YouTube's policies – for instance, those that contain "inflammatory religious or supremacist content". These will appear behind a warning wall and cannot be monetised.
Additionally, Google will expand its efforts to debunk and counteract recruiting practices using the "redirect method", which "focuses on the slice of ISIS's audience that is most susceptible to its messaging, and redirects them towards curated YouTube videos debunking ISIS recruiting themes".
The op-ed comes only a few weeks after Theresa May's election manifesto proposed a tighter grip on online giants like Google and Facebook, suggesting measures which would hold them more accountable for content which they host, especially with regard to extremist and violent content.
Google's op-ed is likely in part an attempt to head off such formal regulation by the UK, by showing that the company is taking a proactive stance.