Who can you trust in our AI-driven world of deepfakes and automatically generated messages? “Deepfake” is a portmanteau of “deep learning” and “fake,” and some of these fabrications are so convincing that even experts can be fooled.
One scam on our radar takes the word “deceptive” to a new level. Instead of a human phone scammer on the line, criminals are using AI to mimic the voices of your loved ones and demand financial assistance.
Keep reading to discover how thieves incorporate artificial intelligence (AI) into their scams.
Phone scammers using AI technology
One senior couple in Saskatchewan received an alarming call from what sounded like their grandson, Brandon. As reported by the Washington Post, the voice on the phone was asking desperately for bail money. According to the call, Brandon was in jail and needed help quickly.
The couple scrambled to gather the funds from their savings and nearly sent the money. Thankfully, a manager at their local bank caught wind of the story and warned them that scammers were making the rounds, tricking innocent people into sending cash.
These calls weren’t just clever, generic impersonations of loved ones like Brandon. While phone-based imposter scams of the past relied on professional voice actors to convince victims that their friends or family were in trouble, this new wave of fake calls is generated with artificial intelligence. How?
If you’re familiar with the deepfake technology used to mimic celebrities, this tale will feel hauntingly familiar. With only seconds of a target’s recorded voice, criminals can use machine learning to fabricate a conversation about anything. The common denominator: the deepfake always asks for money.
Schemes like this are challenging enough for professional analysts to detect when there’s a visual element involved. Without a video, the task becomes nearly impossible.
The technology has been used in scams like this since as early as 2019. Since then, threat actors have stolen millions of dollars from corporate ventures and ordinary people like you.
Once scammers have enough information about your social circle, they can choose from a catalog of AI voices and fine-tune one to match the recorded audio they’re referencing. You might be surprised by how close they can get. And once you’ve sent the money, there is often no way to recoup your losses.
How to tell when an AI-generated call is fake
Convinced you’d be able to spot a deepfake if one called you? Think again. If you get an unexpected call from a loved one asking for money, we recommend thinking twice before handing it over.
First, double-check the number calling you. In the story above, the caller claimed to be phoning from a police station, which might be enough to pull the wool over some victims’ eyes.
Try calling your loved one’s real phone number immediately after getting a frantic call like this. If they don’t answer, verify the story independently: look up the official number of the police station or other organization the caller mentioned and call it yourself.
Audio phone calls aren’t the only way into your wallet, either. Similar AI-powered schemes can scam you out of thousands through other channels.