April 17, 2023

Last Updated on January 4, 2024

Scammers are stealing millions from businesses and individuals by using AI to “clone” the familiar voices of coworkers and loved ones to request money over the phone.
As deepfake AI technology improves and becomes more widespread, the cost to use it is dropping, making it more accessible to criminals. Anyone can now deepfake audio, video, and still images using a freeware application, with no programming required.


Attacks are on the rise

The US Federal Trade Commission (FTC) just issued a consumer alert on the rise of family emergency call schemes that use voice cloning. According to the FTC, “All the scammer needs is a short audio clip of your family member’s voice—which he could get from content posted online—and a voice-cloning program. When the scammer calls you, he’ll sound just like your loved one.”

Or it could be your CFO asking you to wire money to an offshore account. In 2021, the cloned voice of a top customer duped a Hong Kong bank out of $35 million in an elaborate deepfake scam. Businesses also need to be alert for fraudsters using deepfake video to interview for remote work positions, which could set them up to perpetrate insider attacks.


What you can do

Whether the request allegedly comes from a relative, a friend, or a member of your company, it’s wise to verify who you’re communicating with before sending money. Especially if the caller asks for a wire transfer, cryptocurrency, or a gift card, end the call and contact the person directly through a number you know to be theirs.

Also be alert for sketchy phone calls from strangers whose sole purpose may be to record your voice. You might likewise consider removing video or audio recordings of your voice, or those of your company executives, from sites like LinkedIn, Facebook, and YouTube.


What’s next?

Companies scammed with deepfake AI voice cloning can face major, immediate financial losses that threaten their viability and damage their reputation and brand image.

To block deepfake scams and other social engineering attacks, it’s good practice to institute and enforce controls that require independent confirmation before any money is sent. To raise awareness, include deepfake voice cloning scenarios in your security awareness training.

To speak with an expert about strategies to elevate your organization’s cybersecurity posture and stay safe from deepfake scams, business email compromise, ransomware, and other prevalent attacks on your employees and data, contact Pivot Point Security.

Interested in a checklist to see how ready you are for an ISO 27001 certification audit?

It's a little more complicated than just checking off a few boxes.
To learn more, download our ISO 27001 Un-Checklist now!