
Stay Safe from AI Voice Cloning Scams

As artificial intelligence (AI) technology continues to advance, so do the tactics of scammers looking to exploit it. One of the latest scams to emerge is AI voice cloning, where fraudsters use AI to mimic a person’s voice.


Scammers know that when potential victims believe they are speaking with someone they know and trust, they're less likely to verify the details of requests for money or sensitive financial data.

Here’s what you should know about this increasingly prevalent scam and what you can do to protect yourself from becoming a victim.

A Typical AI Voice Cloning Scam Scenario

(Ring! Ring!)

Scammer (AI Voice Clone): Hey Peggy, it’s your cousin Ralph. Something urgent has come up. I hate asking for help, but I’m in a bind.

Peggy: Oh, hey Ralph! Are you alright? You sound strange.

Scammer: Well, I took a spontaneous trip abroad, and I’ve found myself in a bit of a mess. My wallet was stolen. I’m stranded, and I have no money and no ID.

Peggy: That’s awful!

Scammer: The embassy is taking forever to help me, and I really need your help to get out of this situation. So, I was hoping you could overnight me some gift cards to cover my hotel and help me eat until the embassy sorts things out.

Peggy: How much do you need?

Scammer: Can you send me $1,500 in Visa gift cards? Most places accept them around here.

Peggy: Ouch! That’s a lot of money.

Scammer: I know, but things are scary here. I promise I’ll pay you back as soon as I get home. Will you help me?

Peggy: Um, OK.

Scammer: Thank you, Peggy! You’re a lifesaver! Here are the details of where to send the gift cards.

Warning Signs of an AI Voice Cloning Scam

AI voice cloning technology has advanced rapidly, enabling scammers to imitate voices with remarkable precision. However, if you remain vigilant, you can pick up on red flags that your caller may be a scammer. These red flags include:

  • Appeals to emotions. Be cautious when a caller attempts to incite panic or urgency, asserting that they’re in danger or require immediate assistance.
     
  • Requests for unusual payment methods. Be wary of callers who insist on receiving money only through hard-to-trace methods, such as gift cards or cryptocurrency, as this is a common indicator of a scam.
     
  • Sound issues. Low-quality AI voice cloning software can produce sound problems such as static, echoing, or fluctuating volume. Listen closely for irregularities like unnatural pauses or unusual speech patterns; even advanced AI voice cloning technology can leave minor imperfections.
     
  • Wild stories. Scammers often concoct elaborate narratives to convince you to comply with their requests. If the situation seems far-fetched or implausible, trust your intuition that a scammer may be at work.

Protect Yourself from an AI Voice Cloning Scam

Shield yourself from AI voice scammers by creating a challenge question, code word, or phrase that you and your loved ones can use to quickly confirm emergency situations. This is often the best defense because it lets you immediately confirm whether the caller is who they claim to be.

If you receive an urgent call before setting up your challenge question, code word, or phrase, remember to slow your reaction during the call. Scammers will try to pressure you into making quick decisions. When a caller tries to rush you, slow the pace of the conversation to gather more information before taking any action.

And if you’re unsure about a call, tell the caller that you’ll call them back momentarily. Then, instead of redialing the number that called you, use a known phone number to reach the person the caller claimed to be. This way, you can verify whether you were actually speaking with that person.


If you suspect a scam, report it to the Federal Trade Commission.