Voices: I worked for Twitter – here’s why the UK’s new Online Safety Bill could kill it

We now know that when the government’s new Online Safety Bill comes into force, it could mean a potential jail sentence for tech bosses who don’t take proper steps to keep children safe. Bosses whose platforms ignore flags from Ofcom to tackle children’s exposure to content such as self-harm, eating disorders or suicide could face up to two years in prison. So, what does this mean for Elon Musk and Twitter?

I can give you my insight: I worked for the social media platform for seven years as a content curator. I was responsible for helping explain what was going on and amplifying content that did precisely that. To understand the ramifications of the new bill for Musk, we need to look at why young people are being served this content in the first place.

Twitter runs two separate timelines on its apps, now called “For You” and “Following”, defaulting to the former. “Following” is fairly clear – it’s a linear feed of posts, replies and retweets by people you have opted to follow, presented in the order in which they were sent. It’s a real-time snapshot; essentially it’s what you signed up for.

“For You”, which until recently was known as the “home timeline”, is different. It sorts posts from people you follow and serves you the ones you’re likely to be most interested in first, with the algorithm making those choices based on who you interact with most regularly. You’ll notice that’s not all you get, however – that same algorithm uses your engagements and settings (and the engagements of others in your network and accounts it deems similar to yours) to serve you a selection of tweets from people you haven’t chosen to follow as well. This is content it assumes you’re going to like.

People complain about this, but it’s a pretty effective system – a good way of finding interesting accounts and keeping up to date with news you actually might be interested in. From Twitter’s perspective, the more accounts you interact with, the more it can learn about you.

It’s these algorithmic timelines that stir the pot and stop a social network from stagnating. It’s not just Twitter, either: Facebook, Instagram and TikTok do the same thing. The days of your social feed being a list of posts in the order they were posted by everyone you follow are long gone.

This is important for the Online Safety Bill, because it’s these algorithms that can cause the problem. You’re not just seeing content you’ve opted into, you’re seeing things someone you follow has opted into. Say you have lots of friends who love Star Trek, Arsenal or Taylor Swift. You’ll see more tweets about those topics. If you engage with those tweets, you’ll see even more of them, because the algorithm reckons it’s onto something.

An argument about Star Trek movies, your ex’s obsession with the Gunners or your brief dalliance with Taylor’s Folklore during lockdown means you’re folded into the fandom content. You see more of it, you interact with more of it, so it sends you even more and it becomes a self-fulfilling prophecy.

And these algorithms are not easy to control: they can be streamlined and directed, but it takes constant attention and finetuning. Plus, there’s a huge development cost to training the AI, and it needs skilled developers to keep on top of it – many of whom have left Twitter in the great Musk exodus.

This is fine when it’s pop stars, TV shows and football teams, but what if it’s more sinister than that? Eating disorders are a good example. There are communities of people on Twitter who struggle with eating disorders like anorexia. In most cases, they’re people supporting one another through their recovery, but there are also communities of people who encourage one another in their disorders.

If you’re part of these communities, the algorithm – when left to its own devices – will continue to serve you up triggering or harmful content. It works the same way with people struggling with poor mental health or addiction. The algorithm isn’t always great at sorting the good from the bad, the wholesome from the harmful.

The new act will force social networks to police timelines – especially for children – much more closely. There are, practically, two ways this can be done: one is by human moderators who deal with reports and scan for problematic trends, and the other is by constantly updating the algorithm and the connections in what’s known as the knowledge graph. And it’s going to be expensive. People cost money, lots of money – and so does the constant maintenance and tweaking of complicated algorithms.

That’s a problem for Twitter. Meta, the company that owns Facebook and Instagram, has more money than God. TikTok is riding a crest at the moment and has huge resources from its Chinese owners. But Twitter? Twitter is bleeding money and people. Remember a few months ago when Elon Musk let go of half his workforce? Content curation and content moderation were a large part of those layoffs – in other words, the teams that scrub the timeline and keep it shiny. More people left shortly afterwards.

Musk paid over the odds for Twitter, his loans carry huge interest and his outgoings are still massive. It’s more than possible that he simply doesn’t (and won’t) have the resources to comply with strict new laws.

Musk is unlikely to put himself in a position where he is at risk of doing time in prison, but that doesn’t mean he suddenly has the resources (the people and money necessary to toe the line) to stay out of Ofcom’s way.

He could shut down operations in the UK, but he can’t afford to lose its revenue; and besides, the EU already has similar laws in place. Twitter’s new management may find itself stuck between obligations imposed by new laws it cannot keep, and penalties – the fines or sentences imposed for not adhering to tech regulations – it cannot afford.