Parents need secret ‘safe words’ with children to tackle scams


Parents should agree secret “safe words” with their children to tackle “Hi Mum” fraud scams, says a Home Office minister.

Lord Hanson, the minister responsible for tackling fraud, said safe words or phrases could help ensure parents knew that it was genuinely their children who were contacting them rather than a fraudster.

His warning comes as it emerged that fraudsters are using AI to mimic children’s voices in a new phone cash scam targeting parents with fake cries for help.

Home Office officials said just three seconds of speech from videos on TikTok, Instagram or other social media sites are all that is needed to generate a clone of someone’s voice.

The fraudsters' use of AI is an attempt to get round the growing suspicion of parents, who have wised up to the "Hi Mum" or "Hi Dad" texts that have been used to steal millions of pounds from families.

In that scam, parents typically received a WhatsApp message from someone pretending to be their child in distress, saying they had lost their phone and urgently needed money sent to pay for a taxi home.

Use of AI technology

The fraudsters – often based abroad – are now using AI technology to leave recorded messages or make calls that perfectly replicate the voice of someone’s child.

It means a parent can get a call and hear their child’s voice telling them they are in danger and urgently need money – and by using AI, the fraudsters can have a two-way chat with the parent.

They ask for money to be transferred to the account of a third party – someone they claim to be with, or a landlord or other person they say they owe money to who is threatening them.

Lord Hanson said parents should agree a “safe phrase” that their children can always use when calling to ask for help so that parents know the message is genuine.

He said: “AI presents incredible opportunities for our society but we must also stay alert to dangers.

"We have been delighted to support this initiative through the Stop! Think Fraud campaign and ensure the public get practical advice about how to guard against these kinds of scams."

Millions of fraud incidents

In the year to March, 3.2 million fraud incidents were reported by households to the Crime Survey for England and Wales.

Over the same period, UK Finance – the umbrella body for Britain’s banks – said 554,293 financial frauds were reported by its members, including “Hi Mum” scams, a 20 per cent increase on the previous year when the total was 460,537.

Research by Starling Bank last month suggested that voice cloning scams could catch many people out.

Nearly half (46 per cent) of people do not know this type of scam even exists, according to a survey for the bank.

In the survey, about one in 12 (eight per cent) people said they would send whatever money was requested, even if they thought the call seemed strange.

Nearly three in 10 people (28 per cent) believed they had potentially been targeted by an AI voice-cloning scam in the past year, according to the Mortar Research survey of more than 3,000 people across the UK, conducted in August.

The bank also suggested that some people could consider agreeing a “safe phrase” with close friends and family members to help them verify that callers are genuine.