Bank issues urgent scam warning as criminals use AI voice cloning to imitate your family and friends and ask for cash

Some people could consider agreeing a “safe phrase” with close friends and family members to help them verify the caller is genuine

GETTY

By Temie Laleye


Published: 18/09/2024 00:01


A major bank has issued an urgent warning about a new scam involving AI voice cloning technology.

Criminals are using this advanced tech to imitate the voices of family and friends, asking for money in emergencies.


The scam exploits people's trust in their loved ones' voices, making it difficult to detect fraud.

Starling Bank has raised the alarm, noting that nearly half of Britons are unaware this type of scam exists.

The technology allows fraudsters to replicate a person's voice from just a few seconds of audio, which can be easily obtained from social media posts.

Scammers then use the cloned voice to stage convincing phone calls or voice messages to family members, asking for urgent financial assistance.

Starling Bank has launched a campaign, featuring actor James Nesbitt, to raise awareness of voice cloning scams

PA

A recent study by University College London (UCL) found that people were unable to distinguish between real human voices and AI-generated ones when played audio clips.

Participants in the study correctly identified the authentic voice only 48 per cent of the time, essentially guessing. However, people were better at recognising AI impersonations of familiar voices, with an 88 per cent accuracy rate for friends' voices.

Prof Carolyn McGettigan, the study's author, stated: "What we're seeing now is that the technology is good enough to mean that listeners may be unable to tell if what they're listening to is the voice of a real person or not."

The study highlights the potential for this technology to be misused in scams and the creation of convincing deepfakes.

Nearly three in 10 (28 per cent) respondents believe they have potentially been targeted by such a scam in the past year, and one in 12 (eight per cent) said they would send money if requested, even if the call seemed strange, a survey by Starling Bank has shown.

The bank's research also found that 46 per cent of people are unaware that this type of scam exists. These findings underscore the urgent need for public education about AI-enabled fraud.

Lisa Grahame, chief information security officer at Starling Bank, warned: "People regularly post content online which has recordings of their voice, without ever imagining it's making them more vulnerable to fraudsters."

She emphasised that scammers only need three seconds of audio to clone a voice, highlighting the ease with which criminals can exploit this technology.


Starling Bank suggested that some people could consider agreeing a “safe phrase” with close friends and family members to help them verify the caller is genuine.

However, there is a chance a safe phrase could be compromised. The Take Five to Stop Fraud campaign urges people to pause and take time to think about whether it could be a scam.

If in doubt, individuals should call a trusted friend or family member for a "sense check" or dial 159 to speak directly with their bank.

Many major UK banks can be reached through this number, including Bank of Scotland, Barclays, Co-operative Bank, First Direct, Halifax, HSBC, Lloyds, Metro Bank, Monzo, Nationwide Building Society, NatWest, Royal Bank of Scotland, Santander, Starling Bank, Tide, TSB and Ulster Bank.

If someone believes they've been scammed, they should immediately contact their bank or payment provider and the police.

Actor James Nesbitt, supporting Starling Bank's campaign, said: "I'll definitely be setting up a safe phrase with my own family and friends."

Grahame said: "It's more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim."

Lord Sir David Hanson, Minister of State at the Home Office with Responsibility for Fraud, added: "AI presents incredible opportunities for industry, society and governments, but we must stay alert to the dangers, including AI-enabled fraud.

"As part of our commitment to working with industry and other partners, we are delighted to support initiatives such as this through the Stop! Think Fraud campaign and provide the public with practical advice about how to stay protected from this appalling crime."
