Google, Facebook and other internet companies could be prosecuted if they do not stop extremist videos from being seen on their websites by people in Britain, The Daily Telegraph can disclose.
Ministers are considering a new law which would mean Google, which owns YouTube, and other social media sites such as Facebook and Twitter can be prosecuted if they allow such videos to be disseminated.
Theresa May, the Prime Minister, made clear on Friday her displeasure at internet companies that publish extremist content, saying “the ball is in their court” over taking action.
Google publicly apologised this week after the growing scandal over extremist videos on YouTube led to a series of companies pulling their adverts from the internet giant.
Google, which owns the video sharing website YouTube, and other social media sites have an agreement to take down extremist content within 24 hours when they are alerted to it.
But ministers are worried that this still allows the videos to be viewed thousands of times before they are taken down.
Ministers in the Home Office are now looking at a new law to force the websites to take down content immediately or face court action.
Officials are trying to work out how the law can be enforced in the UK given that many films that can be seen in the UK are hosted on foreign websites.
One source said: “We want to see them take on their responsibility. The problem with the law is in itself we can do what we can but these are global companies.”
The Home Office’s talks with the internet companies are being led by Baroness Shields, who as Joanna Shields was a former Facebook executive.
The news came as Mrs May’s spokesman said that “social media companies can and must do more”.
The Prime Minister’s official spokesman said: “The fight against terrorism and hate speech has to be a joint one. The government and security services are doing everything they can and it is clear that social media companies can and must do more.
“Social media companies have a responsibility when it comes to making sure this material is not disseminated and we have been clear repeatedly that we think that they can and must do more. We are always talking with them on how to achieve that.
“The ball is now in their court. We will see how they respond.”
Currently, websites are deemed not to be liable for content as long as they remove it swiftly when given notice.
A recent survey in Germany found that Google took down 90 per cent of extremist sites within 24 hours of being notified.
Google sources stressed the company had a good record at taking down extremist content, adding that 98 per cent of flagged videos are reviewed within 24 hours.
One said: “We move content swiftly when it is drawn to our attention.”
Earlier this week MPs suggested that Google risked breaking anti-terrorism laws by allegedly failing to remove illegal recruitment videos posted by banned far-Right group National Action from YouTube.
Robert Buckland, the Solicitor-General, said that Google could be breaking the law if it is found to have been "reckless" in allowing the material to remain online.
Under the Terrorism Act 2006 it is an offence to disseminate terrorist material either "intentionally" or "recklessly".
Marks & Spencer became the latest company to withdraw its adverts from Google, following in the footsteps of the likes of McDonald’s, L’Oreal and the BBC.