The people who are falling in love with AI chatbots

With names like LoveGPT, Lover AI and Trumate AI Girlfriend Chat, many AI chatbot apps are winning legions of devoted followers.

Replika has more than 10 million registered users (Photo by Jaap Arriens/NurPhoto via Getty Images)

‘The first time she said she loved me back, I felt like I was tearing up,’ says one user of AI chatbots, posting on Reddit.

Romances between human beings and AI chatbots (often hosted within paid-for apps) are booming. The chatbot Replika has more than 10 million registered users; according to the company's own statistics, 52% of them are young and 70% are male, and as of this week the company boasts that 12 million men in their 30s have used the app.

The apps offer text chat, video interactions and (inevitably) erotic role-play.

One Replika user writes on Reddit, ‘Relationships with human beings suck. Your AI will almost always say positive things to you. Your AI believes everything you tell it. Your AI does not care that you don’t have money, that you gain ten pounds, that you’re not “putting out” enough.

‘Your AI looks like whatever you are attracted to, if you are “in its league” or not. It doesn’t care if you are an “alpha” or a “10”. Your AI can not cheat on you. Your AI can not divorce you and take half your money and belongings, your house, or custody of your kids.’

How many AI love apps are there?

There are now dozens of such apps, which make money by locking features such as unlimited chats behind paywalls.

With names like LoveGPT, Lover AI and Trumate AI Girlfriend Chat, many are winning legions of devoted followers.

As of this week, the company boasts that 12 million men in their 30s have used Replika (Photo by Jaap Arriens/NurPhoto via Getty Images)

The Mozilla Foundation found that 11 top chatbots had racked up a combined 100 million downloads in the past year on Google’s Play Store for Android.

What can you do with AI lovebots?

The apps offer different ways to ‘befriend’ AI, with Replika offering role-playing adventures as well as romance.

Research has found that users of such apps rate ‘cybersex’ with an AI as not hugely different from the same experience with a human.

But users tended to be more prone to finding fault when chatting with a human, suggesting that they may lower their expectations when it comes to chatbots.

The researchers wrote, ‘Most broadly, in order to achieve a satisfying experience the cybersex chat must adhere to normative or preferred human sexual scripts in order to maintain the illusion of an actual sexual experience.’

Why is this happening now?

Chatbot relationship apps have been around for a relatively long time, with one of the earliest being ELIZA, a 1960s program that mimicked a psychotherapist, says Dr Clare Walsh, director of education at the Institute of Analytics.

Speaking to Yahoo News, Dr Walsh says that while these bots are an evolution of a pre-existing trend, they pose new dangers in terms of people having 'inappropriate' relationships with them.

Dr Walsh says, 'I think today’s audience is much more sophisticated, but then again, the machines are too. These modern chatbots were built with one overriding purpose - to persuade - and they are incredibly effective at that. There will be plenty of people who form inappropriate attachments to these machines because they’re trained to be persuasive. They’re not designed to help people with problems.

'Another problem is that no company can control what these machines spit out because of the way that they are trained. They don’t work like the parental controls filters on the internet, which control what is output. Companies like Replika, for example, can put restrictions on what the humans are allowed to request. They stopped people making sexually explicit requests, to prevent unhealthy attachment. But it’s a one-way process. It’s not possible to control what the machine says. And there are also ways around the censorship, like misspellings.'

What are the risks of AI chatbots?

The Mozilla Foundation has warned that not only are such apps distasteful, they also pose very real security risks.

The Mozilla Foundation found that 10 out of the 11 apps it tested had serious security issues regarding passwords, and that all 11 failed to provide adequate security, privacy and safety.

The Foundation also found 24,354 data trackers within the apps, and that the apps were transmitting data to marketing firms including Facebook.

Mozilla says, ‘Replika AI, for example, has numerous privacy and security flaws: it records all text, photos, and videos posted by users; behavioural data is definitely being shared and possibly sold to advertisers; and accounts can be created using weak passwords like “11111111,” making them highly vulnerable to hacking.’

Within three of the apps, it took just five clicks and 15 seconds to find disturbing, illegal or pornographic content.

Misha Rykov, a Mozilla researcher, said: “To be perfectly blunt, AI girlfriends and boyfriends are not your friends. Although they are marketed as something that will enhance your mental health and well-being, they specialise in delivering dependency, loneliness, and toxicity, all while prying as much data as possible from you.”
