Google has been condemned for not immediately taking down graphic footage of the terrorist attack on two mosques in New Zealand.
At least 49 people have been killed and 48 injured in shootings at Masjid Al Noor and the Linwood Masjid mosques in Christchurch.
One man has been charged with murder and three more held in custody over the attacks, which began at 1.40pm local time on Friday.
One of the gunmen live-streamed the attack on Facebook and identified himself as an Australian called Brenton Tarrant.
Soon afterwards, the footage was circulating on YouTube, Reddit and other social media sites, and remained easy to find via Google search.
Google’s initial response was to place an ‘inappropriate content’ warning over the film, but users could still click through to watch the attack unfold.
The footage shows the gunman picking up a gun, entering the mosque and opening fire on his victims.
Google’s response has been widely condemned, with Home Secretary Sajid Javid saying ‘enough is enough’ and accusing YouTube, Google, Facebook and Twitter of failing to take ownership of the content being shared on their networks.
— Sajid Javid (@sajidjavid) March 15, 2019
Deputy leader of Labour Tom Watson also denounced Google, the parent company of YouTube, for failing to fully remove the broadcast, saying their response was ‘not good enough’.
Google have contacted me to explain that they posted the "inappropriate" content warning on the NZ massacre footage while they "reviewed the video" for YouTube. Not good enough. They should have just taken it down, then reviewed it. pic.twitter.com/5Zh2IfxwgR
— Tom Watson (@tom_watson) March 15, 2019
In another tweet he said: “If broadcasting mass murder is not a violation of YouTube’s terms of service, then what is?”
Downing Street has also demanded that UK news and media companies remove the terrifying footage from their websites.
Theresa May’s spokeswoman said: “Facebook, Twitter, YouTube and other providers have taken action to remove the video and other propaganda related to the attack.
“The government has been clear that all companies need to act more quickly to remove terrorist content.
“There should be no safe spaces for terrorists to promote and share their extreme views and radicalise others.”
Facebook New Zealand spokeswoman Mia Garlick said the video of the attack was ‘quickly removed’ and any ‘praise or support’ for it was being taken down.
In a statement, she said: “Our hearts go out to the victims, their families and the community affected by this horrendous act.
“New Zealand Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video.
“We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”
YouTube released a statement saying it was ‘working vigilantly’ to remove the footage:
Our hearts are broken over today’s terrible tragedy in New Zealand. Please know we are working vigilantly to remove any violent footage.
— YouTube (@YouTube) March 15, 2019
An archive of a Facebook page thought to belong to attacker Brenton Tarrant contained dozens of posts in the last week about multiculturalism in Europe, several referring directly to the UK.
Among them were YouTube recordings of speeches by former British Union of Fascists leader Oswald Mosley.
New Zealand Police have asked the public not to share the ‘extremely distressing’ video on any social media platform.
At the time of publishing this article, Google had yet to issue a public statement of its own.