Fraudsters using AI to mimic children's voices to steal millions of pounds from unsuspecting parents

Fraudsters are using AI to mimic children's voices in a new phone cash scam targeting parents with fake cries for help.

Ministers are now warning families to develop secret codewords or phrases to use in moments of distress so they won't be fooled by the calls and part with their money.

Home Office sources say just three seconds of speech from videos on TikTok, Instagram or other social media sites are all that is needed to generate a clone of someone's voice.

The AI fraud is the latest incarnation of the "Hi Mum" scam, which has been used to steal millions of pounds from unsuspecting parents in the UK in recent years.

In that scam, parents typically received a WhatsApp message from someone pretending to be their child in distress, saying they had lost their phone and urgently needed money sent to them to get a taxi home.

Mumsnet internet forum founder Justine Roberts (pictured) said: "Hi Mum scams, where fraudsters pretend to be a child needing help, are designed to prey on parents' emotions"

Speaking to The Mail on Sunday, Fraud Minister Lord Hanson urged parents to agree a "safe phrase" that their children can always use (stock photo)

Now, thanks to AI technology, fraudsters – often based abroad – are leaving recorded messages or making calls that perfectly replicate the voice of someone's child. It means a parent can get a call and hear their child's voice telling them they are in danger and urgently need money – and by using AI, the fraudsters can hold a two-way conversation with the parent.

Their cries for help range from claiming their bank card and phone have been stolen – so they are with a friend who can withdraw money for them if the cash is transferred to that third party's account immediately – to needing to urgently pay, for example, their landlord or someone they have got into trouble with who is threatening them for the money.

Speaking to The Mail on Sunday, Fraud Minister Lord Hanson urged parents to agree a "safe phrase" that their children can always use when calling to ask for help, so that parents know the message is genuine. He said: "AI presents incredible opportunities for our society but we must also stay alert to dangers.

"We have been delighted to support this initiative through the Stop! Think Fraud campaign and ensure the public get practical advice about how to guard against these kinds of scams."

A source close to Home Secretary Yvette Cooper added: "Just imagine the risks when this technology enables fraudsters to steal the face and the voice of the person they are pretending to be, and carry out a video conversation with their intended victim.

"There is clearly a challenge for government to control that threat, but it also shows why families need things like safe phrases to protect themselves too."

In the year to March, 3.2 million fraud incidents were reported by households to the Crime Survey for England and Wales.

Scammers would often spoof the number of a bank to gain a victim's trust, before convincing them that their account had been subject to fraudulent activity (stock photo)

Home Office sources say just three seconds of speech from videos on TikTok, Instagram or other social media sites are all that is needed to generate a clone of someone's voice

Over the same period, UK Finance – the umbrella body for Britain's banks – said 554,293 financial frauds were reported by its members, including "Hi Mum" scams, a 20 per cent increase on the previous year, when the total was 460,537.

Mumsnet internet forum founder Justine Roberts said: "Hi Mum scams, where fraudsters pretend to be a child needing help, are designed to prey on parents' emotions. If you think your son or daughter is in trouble, it's natural to want to act quickly, and plenty of parents who would normally consider themselves pretty scam-savvy have been caught out by anxiety-inducing texts purporting to come from their child.

"The use of AI technology to mimic voices will make it even harder to spot the scammers.

"Parents on Mumsnet have suggested agreeing on a codeword or question for use in these scenarios, so it's great to see the Government promoting this measure.

"Taking the time to sit down and have a conversation with your kids about how you'll handle an emergency makes it less likely that you'll be caught out, and having an established codeword is a simple way to protect yourself and your family."

A Home Office spokesman said: "Artificial Intelligence has already begun to transform the way scams are perpetrated on victims and how believable they can seem.

"The Government's new fraud strategy will have the threat from AI-enabled scams at its heart. It is vital that we get ahead of the game with these risks or we will see many more innocent people falling prey to the fraudsters."

Source: Daily Online