This bank says ‘millions’ of people could be targeted by AI voice-cloning scams

By Anna Cooban, CNN

London (CNN) — “Millions” of people could fall victim to scams using artificial intelligence to clone their voices, a UK bank has warned.

Starling Bank, an online-only lender, said fraudsters are capable of using AI to replicate a person’s voice from just three seconds of audio found in, for example, a video the person has posted online. Scammers can then identify the person’s friends and family members and use the AI-cloned voice to stage a phone call to ask for money.

These types of scams have the potential to “catch millions out,” Starling Bank said in a press release Wednesday.

They have already affected hundreds. According to a survey of more than 3,000 adults that the bank conducted with Mortar Research last month, more than a quarter of respondents said they have been targeted by an AI voice-cloning scam in the past 12 months.

The survey also showed that 46% of respondents weren’t aware that such scams existed, and that 8% would send over as much money as requested by a friend or family member, even if they thought the call seemed strange.

“People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters,” Lisa Grahame, chief information security officer at Starling Bank, said in the press release.

The bank is encouraging people to agree a “safe phrase” with their loved ones — a simple, random phrase that’s easy to remember and different from their other passwords — that can be used to verify their identity over the phone.

The lender advises against sharing the safe phrase over text, which could make it easier for scammers to discover it; if it is shared that way, the message should be deleted once the other person has seen it.

As AI becomes increasingly adept at mimicking human voices, concerns are mounting about its potential to harm people, for example by helping criminals access their bank accounts or spread misinformation.

Earlier this year, OpenAI, the maker of generative AI chatbot ChatGPT, unveiled its voice replication tool, Voice Engine, but didn’t make it available to the public at that stage, citing the “potential for synthetic voice misuse.”

The-CNN-Wire
™ & © 2024 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
