
Beware of scammers using AI-generated voice clones, FTC warns

The Federal Trade Commission (FTC) has issued a consumer alert warning people to be on guard for a new type of phone scam that uses AI-generated voice clones. Criminals use the cloned voices to impersonate friends and family members and swindle victims out of money.

The scam begins with a call from someone who sounds like a friend or family member, urgently asking for money to get them out of trouble. The scammers need only a short audio clip of the person’s voice, often taken from content posted online, and a voice-cloning program. When the scammer calls, they sound just like the victim’s loved one.

The FTC recommends that anyone who gets a call from someone who sounds like a friend or relative asking for money, particularly by wire transfer, cryptocurrency, or gift card, hang up and call the person directly to verify the story.

The agency could not estimate how many people have reported being ripped off by thieves using voice-cloning technology. However, incidents involving this type of scam are increasing, and the financial losses are significant.

The sources for this piece include an article from NPR.
