Sydney church stabbing: social media pages ‘infamous’ for spreading misinformation taken down

Chris Minns and NSW police commissioner Karen Webb. The premier has expressed concern at the amount of misinformation and violent imagery on social media. Photograph: Ayush Kumar/AFP/Getty Images

Social media pages “infamous” for spreading misinformation have been taken down after the Wakeley church stabbing attack, the New South Wales premier, Chris Minns, said on Thursday, while expressing alarm at the “wildfire” of rumour and graphic content still proliferating on tech platforms.

On Monday night YouTube was live broadcasting Bishop Mar Mari Emmanuel’s service at the Assyrian Christ the Good Shepherd church. After the stabbing occurred, video clips spread through WhatsApp groups before police had arrived on scene.

A 19-year-old man, Dani Mansour, fronted court on Thursday charged with riot, affray and damage to property for his alleged actions outside the church, where an estimated 2,000 people gathered on Monday night.

Related: eSafety commissioner orders X and Meta to remove violent videos following Sydney church stabbing

Mansour was granted strict bail with a ban on social media access. NSW police based their investigation on Mansour’s Instagram posts, the court heard on Thursday. Police continue to comb through social media material to identify other alleged rioters.

WhatsApp, owned by Meta, is the platform most cited in recent days as a source of much of the violent imagery and misinformation. In recent years it has attempted to slow the spread of misinformation by limiting forwarding to five chats at once and labelling messages that have been forwarded multiple times; such heavily forwarded messages can only be sent on to one chat at a time.

Meta said in 2020 the change had helped reduce the spread of viral messages on the platform by 70%.
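WhatsApp's actual implementation is proprietary, but the friction mechanism described above can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the hop-count threshold, the constant names and the `Message`/`forward` structure are invented, not WhatsApp's code; only the two limits (five chats for an ordinary forward, one chat for a heavily forwarded message) come from the article.

```python
# Illustrative sketch only - NOT WhatsApp's actual code or API.
# Models the forwarding friction described in the article: ordinary
# messages may be forwarded to at most five chats at once; messages
# already forwarded many times are labelled and limited to one chat.

FORWARD_CAP = 5           # assumed: max chats per ordinary forward
HIGHLY_FORWARDED_AT = 5   # assumed: hop count that triggers the label
HIGHLY_FORWARDED_CAP = 1  # heavily forwarded messages go to one chat


class Message:
    def __init__(self, text, forward_hops=0):
        self.text = text
        self.forward_hops = forward_hops  # times already forwarded

    @property
    def highly_forwarded(self):
        # Once a message has been passed along enough times, it is
        # labelled and subject to the stricter limit.
        return self.forward_hops >= HIGHLY_FORWARDED_AT


def forward(message, chat_ids):
    """Forward a message, enforcing the per-send chat limit."""
    cap = HIGHLY_FORWARDED_CAP if message.highly_forwarded else FORWARD_CAP
    if len(chat_ids) > cap:
        raise ValueError(f"can forward to at most {cap} chat(s) at once")
    # Each delivered copy carries an incremented hop count, so the
    # "forwarded many times" label propagates with the content.
    return [Message(message.text, message.forward_hops + 1) for _ in chat_ids]
```

The design point is that the limit is per *send*, not per message lifetime: content can still spread, but each hop requires another deliberate action, which is the friction Meta credited with the 70% drop in viral message spread.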

Since introducing end-to-end encryption on the platform to protect user privacy, Meta no longer has access to the content of messages and so cannot monitor what is spreading. But the company says it now has technology to spot accounts engaging in abnormal behaviour, banning 8m accounts a month – 75% of them before they are reported by users.

Minns told reporters on Thursday that NSW police and the state government were concerned about the amount of unsubstantiated rumour and graphic content still accessible on social media sites.

“It proves very difficult to foster community cohesion and harmony, to calm down the community, to send messages of unity in a difficult period when social media firms still continue to disseminate terrible pieces of information, untruths, rumours that circulate like wildfire through an anxious community,” he said.

He said in the immediate aftermath of the attack, the NSW government liaised with the federal government and the eSafety commissioner to have pages “that have become famous or infamous for spreading misinformation in the community” taken down.

“They are down, which is good news [to] stop, in many instances, [misinformation] about damage to mosques and churches [that] was being spread like wildfire and inflaming tensions in the community.”

Minns did not specify on which platform the pages were hosted.

The eSafety commissioner has no powers to regulate the spread of misinformation, but since the Bondi stabbing attack on Saturday and the church attack on Monday has been in communication with the platforms about the removal of violent content. Violent content or content inciting violence is classified as “class 1” material under Australian classification law.

The takedown process has involved informal requests to remove some of the more graphic content related to the Bondi stabbing attack, as well as formal notices issued to Facebook’s parent company, Meta, and X over content related to the church stabbing.

Related: Sydney bishop Mar Mari Emmanuel forgives alleged attacker after church stabbing, calls for ‘Christlike’ response

On Wednesday night, a spokesperson for the eSafety commissioner said Meta had complied with the notices, while the compliance of X – the platform formerly known as Twitter before it was bought by the billionaire Elon Musk in 2022 – was still being reviewed.

The attacks and the social media fallout have drawn attention back to the federal government’s proposed misinformation legislation, which would give stronger powers to the Australian Communications and Media Authority. Under the bill, Acma could force social media companies to get tougher on “content [that] is false, misleading or deceptive, and where the provision of that content on the service is reasonably likely to cause or contribute to serious harm”.

The bill’s introduction was delayed last year after initial consultation on the proposal led to claims it would stifle speech online, and would not protect religious speech. But the government has remained committed to releasing the legislation later this year.

On Wednesday the communications minister, Michelle Rowland, said the incidents highlighted the need for action.

“If we needed to see any case study about what can happen when misinformation spreads at speed and scale, we only need to look at what happened in western Sydney the other night – the damage to public property, threats to life and health,” she told the ABC.

“We know the platforms have incredible powers and abilities to be able to examine content on their platforms. Their algorithms are opaque. They need to do more.”

X did not respond to a request for comment.