Scammers use AI to clone loved ones’ voices

By Kate Ready, Jackson Hole Daily Via Wyoming News Exchange
Posted 6/22/23

JACKSON — A new tool is available to scammers: your voice.

The Federal Trade Commission has warned that “artificial intelligence voice scams” are on the rise nationally.


Scammers are able to grab audio files of your voice from social media and upload them into programs that can replicate how you sound and make “you” say anything they wish.

One common scheme has been dubbed the "grandparent scam" or "family emergency scam": parents, or more commonly grandparents, receive a call that appears to come from a loved one's own phone number. They then hear their loved one sobbing before an unknown voice demands a ransom.

The loved one's sobbing voice, however, was generated with AI voice "cloning" technology.

In a global survey of 7,000 people from nine countries, including the United States, 1 in 4 people reported experiencing an AI voice cloning scam or knew someone who had.

The survey was published in May by the U.S.-based McAfee Labs.

The FTC is cautioning people not to trust their ears as fraudsters are easily able to mimic the voice of your loved ones using widely accessible online tools.

“You get a call. There’s a panicked voice on the line. It’s your grandson. He says he’s in deep trouble — he wrecked the car and landed in jail. But you can help by sending money,” the FTC warned in March.

Other common “imposter” scams entail fraudsters posing as local law enforcement or government agencies. According to the FTC, last year people lost a total of $2.6 billion to imposter scams.

The FTC cautioned people not to trust the voice on the other end. Hang up the phone and “verify the story.”

“Call the person who supposedly contacted you,” the FTC said in a statement. “Use a phone number you know is theirs. If you can’t reach your loved one, try to get in touch with them through another family member or their friends.”

If a caller asks you to pay with cash, cryptocurrency or gift card numbers, the call is very likely fraudulent, the agency said.

Some families are establishing a “safe word” that allows them to verify whether a call is genuine.

The FTC is warning that on a broader level, society must confront the rise of misleading “synthetic media” that may be used to cause widespread harm.

“A growing percentage of what we’re looking at is not authentic, and it’s getting more difficult to tell the difference,” the FTC said in a statement. “And just as these AI tools are becoming more advanced, they’re also becoming easier to access and use.”

ElevenLabs, a U.S.-based AI start-up, acknowledged in January that its Prime Voice AI tool had been misused after users of an online bulletin board posted fake audio clips of celebrities, including Joe Rogan, James Cameron, Joe Biden and Emma Watson, spewing racial slurs and making outrageous comments.

In the cloned voice clip of Emma Watson, the actress was reading Adolf Hitler’s manifesto, “Mein Kampf.”

Local law enforcement said they hadn’t received reports of this happening locally.

The practice has been dubbed "deepfaking": a deepfake is a video or audio recording in which a person's body or voice has been digitally altered so that they appear to be someone else, and it is typically used maliciously or to spread false information.

If you spot a scam, report it to the FTC at ReportFraud.ftc.gov.
