German synagogue shooting video viewed by 2,200 people on Amazon's Twitch

Margi Murphy
Policemen climb over a wall close to the site of the shooting - DPA

Amazon-owned Twitch has confirmed that around 2,200 people used the popular video-sharing website to watch a deadly shooting outside a synagogue in Halle, Germany on Wednesday morning, before it was taken offline. 

The shooter, named by local press as Stephan B, a 27-year-old German, created a channel on Twitch two months before the attack, in which two people were killed. He had used it to livestream only once before. 

Twitch said that five people watched an initial live version of the shooting, which broadcast footage from the suspect’s head-mounted camera. The stream showed the suspect trying to force his way into the synagogue, which was packed with worshippers, while shooting and throwing what one witness described as “Molotov cocktails”.

Alerted by the gunfire, the worshippers quickly barricaded the doors. The attacker then turned his attention to the streets, shooting dead a woman who was passing by a nearby Jewish cemetery. He continued to a kebab shop close by, where he shot one man dead. 

During the 35-minute stream, the suspected far-right fanatic can be heard saying in English: “My name is Anon and I think the holocaust never happened...feminism is the cause of declining birth rates in the West which acts as a scapegoat for mass immigration, and the root of all these problems is the Jew.”

Once the broadcast ended, a recording was automatically generated on the platform, where it was viewed by 2,200 people in the 30 minutes it took Twitch to take it down.

A Twitch spokesman said that the video had been shared in a “coordinated” manner on other messaging services. Typically, social networks such as Twitch, YouTube and Facebook will recommend videos when they begin to surge in popularity, but Twitch said this was not the case here. It said that the copy of the Halle shooting was taken down by a moderator after a viewer reported it through its flagging system. 

The arrested man, described as a “white German”, was last night being treated in hospital for injuries sustained during his attempt to escape.

It is not the first time terror attacks and disturbing incidents have been shared online. Facebook said it removed 1.5 million videos of the New Zealand mosque attack in the 24 hours following the events in March.

That video spread quickly across YouTube and Twitter, and was widely shared in private messaging apps after the technology companies had stamped out public posts. Copies could still be found on many social networks weeks after the event. Facebook said it was investing in artificial intelligence that can detect such videos and remove them as quickly as possible.