TikTok creates European Safety Advisory Council to inform on content moderation


TikTok has created a safety council to advise the social media platform on content moderation policies and practices in Europe.

The European Safety Advisory Council will be made up of nine industry leaders, as well as figures from academia and civil society, who will help the firm develop policy.

The council will be tasked with advising the platform on existing and emerging issues that may affect TikTok’s users, the company said, and on how to approach content moderation of certain subjects.

Content moderation on social media has become a prominent issue around the world amid the increased spread of misinformation and other forms of harmful content, particularly during the pandemic, with some calling for increased scrutiny and regulation of online platforms.

The Government’s Online Harms Bill, which will introduce stricter regulation for tech firms with harsh penalties for failing to protect users, is expected before Parliament later this year.

However, some platforms have attempted to make their own efforts to improve self-regulation – last year Facebook announced the creation of an independent Oversight Board to deliberate on content moderation decisions made by the social network.

TikTok’s head of product policy in Europe, Julie de Bailliencourt, said it was important for the video-sharing app to seek outside input on how to handle the wide range of content it sees on the service.

“The idea really is to have open, candid conversations with them (the council) on a range of topics that we want to discuss, so that we can take our policies and our safety measures to the next level,” she told the PA news agency.

“We’re very ambitious, and we want to think of solutions that will really work for TikTok, to make sure that people come and have fun and enjoy themselves and still have a really safe experience.

“And those experts we think can really help us understand emerging trends and things we may not have thought through – get their opinion on some of the things that we’re suggesting to help us get to a really great level of safety on the platform.”

She added the council would be about more than creating policy, and would be a vital tool for the platform to help it craft safety features around education and how to start conversations on sensitive topics.

“For example, young people who are coming out of an eating disorder,” she told PA.

“We’ve had lots of conversations on this already, on what may feel intuitively like the right thing to do when you don’t know about this topic or when you don’t have a lived experience, but may actually end up being unintentionally harmful or not the right thing to do.

“So when we speak with experts we want them to think around corners and tell us ‘hey, what you’ve got there is great but there’s one thing you haven’t quite thought through’.”

The nine inaugural council members include Alex Holmes, the deputy chief executive of UK non-profit The Diana Award and founder of Anti-Bullying Ambassadors; Ethel Quayle, professor of forensic clinical psychology in the School of Health in Social Science at the University of Edinburgh; and Seyi Akiwowo, founder and chief executive of UK charity Glitch.

Also on the council are Satu Raappana, manager of online crises services at the Finnish Association for Mental Health; Judy Korn, head of the German organisation Violence Prevention Network; Kristine Evertz from Dutch non-profit Blijf Groep; Justine Atlan, chief executive of e-Enfance, the French NGO for young people’s safety online; Robin Sclafani, director at CEJI – A Jewish Contribution to an Inclusive Europe; and Ian Power, head of Irish non-profit Community Creations.

TikTok said it hopes to add more members from more countries and different areas of expertise in future.
