Meta Is Delaying The Release Of Its AI Tool Voicebox

Thursday, June 29, 2023


Voice scams have been around for a while, but artificial intelligence has made the game far easier for scammers to win. Social media giant Meta is delaying the release of its AI tool Voicebox because it carries too many risks of misuse.

Decrypt.co reports that Meta has said Voicebox could make it easier to create "deepfake" speech, which can be used in scams or to spread hate speech and misinformation. According to Meta, Voicebox does not need to be trained for a specific speech-generation task; it can generate new speech from raw audio clips.

No "Deepfakes," Please

The "deepfake" trend is as dangerous as it is popular, since it allows scammers or other bad actors to create realistic videos or audio tracks to deceive others more easily.

A security expert from VPNOverview said scammers can use "voice cloning" AI to leave misleading voicemails or disguise their own voices. Meta's Voicebox could make that even easier.

Fraudsters use AI tools to impersonate family members' voices, usually to convince victims to send money to get a loved one out of trouble. All a scammer needs is a short clip of the person's voice to make a call sound genuine.

Is It Real or a "Deepfake"?

Most people would probably say they'd have no trouble distinguishing a loved one's voice from a scam, but is that the case? Not necessarily. McAfee surveyed 7,000 people worldwide and found that 70% weren't sure they could tell the difference between a cloned voice and the genuine article.

Cybercriminals create urgent-sounding messages to hook their victims, a spokesperson from McAfee says. They use scenarios saying a loved one has been injured, stranded, or robbed, for instance, and they need money immediately. Some may even go as far as saying the loved one has been arrested and needs bail money. The approach is to craft a message most likely to get an immediate, instinctive response.

McAfee reports that 36% of those who lost money to a voice scam had $500 to $3,000 stolen, while 7% reported losses of as much as $15,000.

Seniors at Risk

Seniors are at especially high risk for voice scams. A study at Baycrest Hospital in Toronto, Canada, found that adults 60 and over had more trouble distinguishing authentic voices from cloned ones; participants aged 30 and under were better able to tell the difference.

Study author Björn Herrmann theorized that older adults pay more attention to speech content, while younger adults tend to listen to its emotion and intonation. He said recognition of AI speech "relies on the processing of rhythm and intonation" rather than the words used, so older adults may be less likely to recognize cloned speech when they hear it.

Don't Get Caught

There are precautions people can take to avoid being taken in by a voice scam. VPNOverview lists several tips for anyone who thinks they might be a scammer's next target.

First, don't answer unknown phone numbers. Most scammers will not use a familiar phone number, so a person is less likely to be scammed if they just don't answer.

Second, if the scammer leaves an urgent-sounding voice message, call the loved one back to see if they're really in trouble. Using social media may also be an effective way to verify whether someone needs your assistance. If they say there's no problem, keep declining the phone calls.

If a potential victim does answer an unknown number, the third tip is to never give their name.

Fourth, always hang up if asked to press or call a number to stop receiving calls. Responding confirms the line is active and can mark someone as a potential scam target.

A fifth piece of advice: always verify the other party's identity whenever money enters the conversation, especially if it involves prepaid cards, wire transfers, or cryptocurrency. Those are major red flags.

Last, it's never a bad idea for family members to use a specific codeword or "safeword" in such an emergency to verify a legitimate call.

If a person thinks they've encountered an AI-generated voice scam, they should report it immediately to the Federal Trade Commission at ReportFraud.ftc.gov and local law enforcement.

© 2024 K-LOVE News
