TikTok and YouTube Shorts target male users with misogynistic content, Irish study finds

Image: the upper body of a man, head cropped out of frame, wearing a navy t-shirt, holding a silver phone in one hand and scrolling with the other. Male-identified users have been found to be targeted by misogynistic content on social media.

New research conducted by Dublin City University has found that male-identified social media users are being disproportionately fed misogynistic and anti-feminist content on TikTok and YouTube Shorts.

Using ten ‘sockpuppet’ accounts on individual unregistered smartphones, researchers from the university’s Anti-Bullying Centre tracked the content recommended to five TikTok and five YouTube Shorts accounts.

Researchers found that algorithms delivered male-supremacist, anti-feminist, and other extremist content to all of the male-identified accounts within just 23 minutes of them being set up, whether the users had sought similar content or not.

Further monitoring of the accounts found that when they expressed interest in the content by watching videos, the misogynistic content “rapidly increased”.

By the conclusion of the study, researchers had watched 400 recommended videos, equating to nearly three hours of content, the vast majority of which was considered “toxic”, falling into the ‘alpha male’ and ‘anti-feminist’ categories. Other “toxic” categories included reactionary right and conspiracy content; “much of this was anti-transgender content”, the research stated.

The study concluded that, of the content recommended to the male-identified accounts, 76% of the TikTok videos and 78% of the YouTube Shorts were misogynistic or anti-feminist in nature.

The study, published today, April 17, was conducted by Professor Debbie Ging, Dr Catherine Baker and Dr Maja Andreasen. 


Speaking about the report’s real-life impact, Professor Ging said, “The findings of the report point to urgent and concerning issues for parents, teachers, policy makers, and society as a whole. Among the authors’ recommendations are better content moderation, turning off recommender algorithms by default and cooperation with trusted flaggers to highlight illegal, harmful, and borderline content.” 

She continued, “They also stress the need for teacher education and the teaching of critical digital literacy skills in schools to equip young people with a better understanding of how influencer culture and algorithms work.” 

She also emphasised that shutting down ‘masculinist’ influencers’ accounts does not appear to significantly reduce the misogynistic content shared across social media.

Professor Ging suggested, “The overwhelming presence of Andrew Tate content in our dataset at a time when he was de-platformed means that social media companies must tackle harmful content in more sophisticated ways.”

Although misogynistic content ultimately impacts girls and women most severely, Professor Ging believes that boys and men are also damaged by consuming such extremist content through social media, especially in relation to mental health and wellbeing. 

She concluded, “The social media companies must come under increased pressure from the government to prioritise the safety and wellbeing of young people over profit.”


The full report can be found here.

The post TikTok and YouTube Shorts target male users with misogynistic content, Irish study finds appeared first on GCN.