
Bone-Chilling AI Scam Fakes Your Loved Ones’ Voices to Demand Hostage Ransom

Hostage Situation

Scammers are already using AI to get smarter and more unhinged — including spoofing voices to demand ransom for loved ones supposedly held "hostage."

Take this chilling anecdote in The New Yorker, which illustrates the phenomenon in grim new detail. In it, a Brooklyn couple who used the names Steve and Robin to protect their identities described waking up in the middle of the night to an unsettling phone call from the husband's parents.

Robin, the wife, recalled initially declining the call from her mother-in-law Mona that roused her from slumber, presuming it was a butt-dial from her early-to-bed in-laws down in Boca Raton. When it rang again, she answered and heard wails: "I can’t do it, I can’t do it."

"I thought she was trying to tell me that some horrible tragic thing had happened," Robin said, and her mind began running through all manner of horrible scenarios that could have happened to her in-laws or her own parents, who also spent winters in Florida.

"Mona, pass me the phone," she heard her father-in-law Bob say, before imploring her to "get Steve, get Steve."

The woman roused her husband, who works in law enforcement, and as he picked up the phone, he heard an unfamiliar man's voice.

"You’re not gonna call the police," the man told Steve. "You’re not gonna tell anybody. I’ve got a gun to your mom’s head, and I’m gonna blow her brains out if you don’t do exactly what I say."

The man demanded that Steve send him $500 on Venmo — which even in the moment seemed like an unusually small amount for a hostage situation — and then put on another woman, perhaps an accomplice, who pumped him for $250 more.

After the second person hung up, the couple called their parents to check on them, only to find that they were safe and sound asleep in their beds.

App For That

Though it's unclear what software the scammers used in Steve and Robin's case, The New Yorker explains that they would have had several off-the-shelf options capable of cloning the voices of Steve's parents, or of anyone else subjected to this disgusting scam.

Among them is ElevenLabs, a New York-based AI firm whose tools are used by The New Yorker itself and many other publications for tasks including article narration and text-to-speech audio.

It can also, however, be used to clone anyone's voice with far less recorded audio as training data than ever before.

"You can just navigate to an app, pay five dollars a month, feed it forty-five seconds of someone’s voice, and then clone that voice," Hany Farid, a generative AI expert at the University of California, Berkeley, told the magazine.

How the scammers got hold of the voices of Steve's parents may never be known, but the couple did get their money back from Venmo, the article reports.

More on AI scams: Hackers Steal $25 Million by Deepfaking Finance Boss