
SCAM ALERT: Scammers Now Using AI To Clone Voices For Fraudulent Phone Calls – Read How To Protect Yourself


The Federal Trade Commission has issued a warning about a new type of scam that could trick people into sending money to criminals. For years, scammers have pretended to be authority figures, like police officers, to demand money to help a friend or family member in trouble. Now, the FTC has warned that criminals are using voice cloning technology to impersonate loved ones and con people into sending them money.

According to the FTC, scammers only need a short audio clip of someone’s voice to generate a clone using artificial intelligence. When the scammer calls, they’ll sound just like the loved one, making it difficult for people to detect the fraud. The FTC has advised people to hang up and call the person directly to verify the story, particularly if the caller asks for payment via wire transfer, cryptocurrency, or a gift card.

Just this month, an Arizona mother received a call from an unknown number and heard what sounded like her daughter crying for help. A scammer then got on the line and threatened to hurt her daughter if the mother didn’t hand over ransom money.

Fortunately, friends who were with her confirmed within four minutes that her daughter was safe, and she realized it wasn’t actually her daughter on the phone. Still, the accuracy of the AI voice clone shook her. “It was completely her voice. It was her inflection. It was the way she would have cried,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

The FTC has not provided an estimate of how many people have fallen victim to this type of scam, but it has highlighted some high-profile incidents. In 2019, scammers used a cloned voice to impersonate the boss of a UK-based energy firm’s CEO, conning him into wiring $243,000. In early 2020, a bank manager in Hong Kong was fooled by someone using voice-cloning technology into authorizing hefty transfers. Earlier this year, at least eight senior citizens in Canada lost a combined $200,000 in an apparent voice-cloning scam.

Experts have warned that voice-cloning technology is advancing, and its cost is dropping, making it more accessible to scammers. “Before, it required a sophisticated operation. Now small-time crooks can use it,” said Subbarao Kambhampati, a professor of computer science at Arizona State University. With the rise of “deepfake” videos, which show celebrities doing and saying things they haven’t, it’s clear that scammers are becoming more sophisticated in their attempts to con people.

Artificial intelligence is no longer a far-fetched idea out of a sci-fi movie. We’re living with it, here and now. 

How to protect against AI voice clone scams

  1. Ask a challenge question or even two – something only your loved one would be able to answer (e.g. the name of a childhood stuffed animal)
    • Remember, don’t ask a question whose answer could be found on social media or elsewhere online
  2. If possible, have someone directly call or text the person the scammer claims needs help
  3. Letting unknown numbers go to voicemail may help, but if the attackers are able to leave a voicemail with your loved one’s voice, it could sound real
  4. Set your social media profiles to private – many attackers look for voice samples from public social media profiles to generate the convincing AI voice clone
    • It’s believed as little as 3 seconds of someone’s voice is needed to create a realistic clone
  5. Don’t share your phone number on social media if possible

(YWN World Headquarters – NYC)


