If a call sounds like your boss (asking for bank account numbers) or your family member (begging for help in an emergency), you’re more likely to act. That’s why scammers use voice cloning to make their requests for money or information more believable.
We’ve all heard the stories: a grandparent gets a call from a grandchild saying they’re stranded somewhere, in trouble, or in jail, and need money sent right away so they can get home. Or an employee gets a call from the boss asking them to go buy gift cards.
The Federal Trade Commission is fighting back!
When the FTC announced its Voice Cloning Challenge last year, the main goal was to encourage innovative ways to help protect people from AI-enabled voice cloning scams. Last week, the FTC announced that it has awarded four top prizes to winning submissions that take a wide range of approaches to doing just that:
1: A solution that would use algorithms to detect whether voice patterns are human or synthetic
2: A technology that would detect voice cloning and deepfakes in real time, analyzing incoming phone calls or digital audio in two-second chunks and assigning each a “liveness score” (a toy sketch of this idea follows the list)
3: A proposal that would watermark audio with distortions that people would not be able to hear, but that could throw off AI voice cloners so the audio could not be accurately cloned
4: A technology that would authenticate that a voice is human and embed that authentication as a type of watermark
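Curious what a “liveness score” could look like under the hood? Below is a toy Python sketch of the idea from entry 2. To be clear, this is not any entrant’s actual method: real detectors use machine-learning models trained on large sets of human and cloned speech. The two-second chunk size comes from the entry; the spectral-flatness heuristic and the scoring scale are purely illustrative assumptions.

```python
# Toy illustration of the "liveness score" idea in entry 2. This is not
# any entrant's actual method, just a sketch of the overall shape: score
# short audio chunks and treat low scores as possible clones. The
# two-second chunk size comes from the entry; the spectral-flatness
# heuristic below is an assumption made for illustration.
import numpy as np

def liveness_score(chunk: np.ndarray, frame_len: int = 512) -> float:
    """Score one ~2-second audio chunk; higher means more human-like."""
    flatness = []
    window = np.hanning(frame_len)
    for i in range(len(chunk) // frame_len):
        frame = chunk[i * frame_len:(i + 1) * frame_len]
        spectrum = np.abs(np.fft.rfft(frame * window)) + 1e-10
        # Spectral flatness: geometric mean over arithmetic mean of the spectrum.
        flatness.append(np.exp(np.mean(np.log(spectrum))) / np.mean(spectrum))
    # Heuristic: live voices tend to vary more from frame to frame than
    # synthetic ones, so map frame-to-frame variability to a 0-1 score.
    return min(1.0, float(np.std(flatness)) * 20)

# Score a stream in two-second chunks at 16 kHz, as the entry describes.
rate, chunk_len = 16000, 32000
audio = np.random.randn(rate * 6)  # stand-in for a real microphone stream
for start in range(0, len(audio) - chunk_len + 1, chunk_len):
    score = liveness_score(audio[start:start + chunk_len])
    print(f"chunk at {start / rate:.0f}s -> liveness {score:.2f}")
```

The takeaway is the flow, not the math: score each chunk as the audio arrives, and flag the call when the score drops.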
You can learn more about the winning proposals on the Voice Cloning Challenge page: https://www.ftc.gov/news-events/contests/ftc-voice-cloning-challenge
The Voice Cloning Challenge is a part of the FTC’s ongoing work to ensure voice cloning technology isn’t used by scammers to cause harm. That work includes preventing misuse where possible, proposing a comprehensive ban on impersonation fraud, and applying the Telemarketing Sales Rule to AI-enabled scam calls. It also includes warning consumers about the use of AI in scams, like when a scammer clones a family member’s voice, calls pretending to be in trouble, and then asks you to send money right away.
So where do scammers get the voice samples they use to create their deepfakes? A sample can come from a video posted on social media, but scammers also record phone calls or hack into the files companies use to store their customers’ and employees’ voice prints for biometric authentication. Since many of today’s technologies need only a few seconds of audio to re-create a real person’s voice, samples are relatively easy to get.
I should also point out that not all voice cloning applications are used for scamming. There are far more cases where the technology is being used for good: dubbing, translation, intelligent voice assistants, screen readers and other assistive technologies, toys, robotics, video games, learning assistants, and many other use cases. It has even been used to preserve the voices of people at risk of losing the ability to speak due to disease, and to preserve dying languages.
AI voice cloning is a growing concern, especially as the technology becomes more sophisticated. Protecting yourself from AI voice cloning scams involves a combination of awareness, caution, and security measures.
Here are some steps individuals can take:
Be Skeptical: Always be cautious when receiving unsolicited phone calls or messages, especially if they request sensitive information or financial transactions.
Verify Caller Identity: If you receive a call from someone claiming to be from a legitimate organization, ask for their contact information and verify it independently through official channels before providing any personal information.
Don’t Share Personal Information: Refrain from sharing personal or sensitive information over the phone unless you are absolutely sure of the caller’s identity and legitimacy.
Use Two-Factor Authentication (2FA): Enable two-factor authentication whenever possible, especially for sensitive accounts such as email, banking, and social media. This adds an extra layer of security beyond voice verification alone. (See the sketch after this list for how those rotating codes are generated.)
Be Wary of Urgent Requests: Scammers often create a sense of urgency to pressure victims into acting quickly without thinking. Take your time to assess the situation and verify the legitimacy of the request.
Voice Biometrics: Some companies offer voice biometric solutions that can help detect if a voice is being cloned or manipulated. Consider using such technologies if available.
Report Suspicious Activity: If you suspect you have been targeted by an AI voice cloning scam, report it to the appropriate authorities, such as your local law enforcement or consumer protection agency, and report it to the FTC at www.ReportFraud.ftc.gov.
Update Privacy Settings: Regularly review and update your privacy settings on social media platforms and other online accounts to minimize the amount of personal information available to potential scammers.
Stay Informed: Keep yourself updated on the latest advancements in AI technology, especially in voice synthesis and cloning, to understand the potential risks and how to mitigate them.
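For the technically curious, here is a minimal sketch of how an authenticator app derives those rotating six-digit 2FA codes, following the standard TOTP algorithm (RFC 6238, built on the HOTP scheme in RFC 4226). It uses only Python’s standard library; the secret key shown is a well-known documentation example, not a real account key.

```python
# Minimal sketch of how an authenticator app computes a TOTP code
# (RFC 6238). Standard library only. The base32 secret below is a
# well-known documentation example, not a real account key.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step             # current 30-second window
    msg = struct.pack(">Q", counter)               # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # prints the current six-digit code
```

Because the code changes every 30 seconds and is derived from a secret that only you and the service share, a scammer who clones your voice still can’t produce it.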
By being vigilant, skeptical, and informed, individuals can reduce the likelihood of falling victim to AI voice cloning scams.
If you get a call like this, call the person who supposedly contacted you back at a phone number you know is theirs and verify the story. If you can’t reach your loved one, try to get in touch with them through another family member or their friends.
Thanks to FTC.gov, www.speechtechmag.com, Microsoft Copilot, and ChatGPT.
And a SPECIAL thanks to Pam Snell for putting this on my radar!