The rise of the aggro-rithm: can misogynistic content be stopped?

Social media is designed to keep users online as long as possible, hooking people with emotionally triggering content. Picture posed by model. Photograph: miniseries/Getty Images

If you’ve had the misfortune of stumbling across misogynist videos from influencers online, you’ll be aware how toxic this content can be. But did you know that more than two-thirds of boys aged 11 to 14 have been exposed to this kind of harmful, damaging “manosphere” content? Or that 70% of teachers noticed a rise in sexist language being used in the classroom over the 12 months to February 2024?

This research was brought to life in a powerful short film earlier this year, called The Rise of the Aggro-rithm. It follows a boy’s gradual descent into misogynistic thinking – a journey that leaves him lonely and sad, with negative feelings towards his female teacher and even his own sister.

Made by Vodafone and the charity Global Action Plan, the film shows the impact that harmful, AI-powered algorithms are having on teen and tween boys. It reflects a growing level of concern among parents, with one in five having noticed a gradual change over time in the language their sons use to talk about women and girls. Experts are now urging families to start talking about what’s potentially flooding their sons’ phones, and how it’s reaching them.

Psychologist Dr Elly Hanson says: “Social media is designed to keep you online for as long as possible, so it shows you things that are emotionally triggering. They’re playing on emotions such as shock, horror, insecurity, paranoia, superiority, outrage, sexuality – they’ve discovered these emotions work to hook people in.”

Worryingly, many boys come across the content when searching for something unrelated, such as fitness or gaming videos. Hanson says it’s important to explain how social media algorithms are designed, because doing so invites your child or teenager into the conversation, and is much more powerful than simply telling them not to watch.

“Questioning things is all part of being a normal teenager,” she says. “So let’s harness that proclivity and invite them to question the levers that are being pulled in order to manipulate them online.”

Simply explaining that these platforms profit directly from you engaging with their content is a powerful first step, says Hanson. What’s most engaging is often what’s controversial and conspiratorial, which is how we’ve ended up with a raft of influencers serving up warped ideas of masculinity that are sexist, offensive and often aggressive. This results in negative and disrespectful behaviour towards women and girls, but it also damages boys’ mental health and capacity for relationships. Two-thirds of boys state that seeing harmful negative content online has made them feel worried, sad or scared.

Kate Edwards, associate head of child safety online at the NSPCC, says that parents need to realise how quickly their child’s phone or tablet can become flooded with toxic content. “Social media now consists largely of short-form content – rapid, quickfire videos being fed to you. And if you watch something to the end, or if you interact with it, like it or comment on it, the app will feed you more and more similar content. It can take you down a rabbit hole very quickly,” she says.

“There are steps you can take to try and teach the algorithm that you don’t want more of what you’re seeing – look for the ‘hide’ button or the ‘I didn’t like this’ option. Explore the different settings on the app, both by yourself and with your child.”

Vodafone has co-designed a digital parenting toolkit with the NSPCC to help parents get ahead of the potential risks. It’s full of conversation starters, activities and tips to help keep young people safe when using the internet, with advice on what to do when they come across something that’s not right.

Sir Peter Wanless, chief executive of the NSPCC, says he’s particularly proud of the partnership with Vodafone, as it’s helping parents navigate an online world that can be just as overwhelming and confusing for them as it is for children. He says: “The toolkit encourages families to have open conversations about their child’s phone usage, for instance, talking through situations that might occur when online. It also explains safety features that can be found on phones, and setting boundaries, such as implementing screen time limits.”

However, screen time rules and parental controls are just one piece of the puzzle. While parents can help stem the flow of harmful content, there is a growing consensus that if we are to break the cycle once and for all, it’s going to take action from the tech companies themselves.

To campaign for this, Global Action Plan has launched a petition calling on regulators such as Ofcom to insist that platforms seize control from AI-powered algorithms and enforce “safety by design”. This was a key element of the 2023 Online Safety Act; however, fears have been raised that the apps will get away with doing as little as possible.

“Even if parents do everything they can, our children are still vulnerable to manipulative algorithms. We can and should do the best we can, but the greatest power lies with the tech companies and the regulator,” says Hanson.

Find out more about Vodafone’s pledge to help 4 million people and businesses cross the digital divide here