AI is being used to 'supercharge' phone scams with the cloned voices of your relatives

(Image credit: Getty Images - Ole_CNX)

Ever received an email from someone claiming to be your long-lost relative, stranded without their wallet and passport during a vacation, and begging for $1,000 to make it back home? We all know it's a classic scam. But let's be honest, even the savviest among us can fall for something like that, and AI is making it even tougher to tell when someone's trying to take us for a ride. The Federal Trade Commission (FTC) is now giving us a heads-up about a fresh con game in which scammers call unsuspecting victims, pretending to be the victims' own family members and using AI tools to make the impersonation more convincing.

Federal Trade Commission Chair Lina Khan (via Bloomberg) said at a recent event that we "need to be vigilant early" as AI tools develop, because we've already seen how these AI tools can "supercharge" fraud.

Khan was talking about crooks grabbing snippets of people's voices from social media and training AI to mimic them. They then feed the cloned voice its lines through AI-powered text-to-speech software, making it sound like a target's distressed relative. It's basically the same technique those mischievous 4chan users used to make AI-generated voice clips of celebrities saying all sorts of terrible things. The scam typically involves the fake relative asking for money to get home, or claiming they're in jail and need bail money: anything to get cash or financial information out of a sympathetic family member.

Khan is concerned that further development of voice-mimicking technology could lead to a surge in scams and other harmful activities, affecting things like civil rights, fair competition, consumer protection, and equal opportunity.

According to the FTC, there were 5,100 reports of phone scams in 2022 alone, resulting in around $11 million in losses. But get this: that's just a tiny fraction of the whopping $8.8 billion lost to fraud overall, which is a 30% increase from the previous year. Scammers are really on a roll.

Bloomberg notes that the US has laws against imposter scams, which already apply to situations like this. However, that's not slowing down scammers very much, and the tools they use in schemes like this are easily accessible and pretty straightforward to operate. Thankfully, reporting fraud and trying to recover stolen funds have become somewhat easier, though it can still be a bit of a nightmare.

In a joint statement on AI with the Department of Justice from April, Khan writes, "Technological advances can deliver critical innovation—but claims of innovation must not be cover for lawbreaking." She continues, "There is no AI exemption to the laws on the books, and the FTC will vigorously enforce the law to combat unfair or deceptive practices or unfair methods of competition."
