By now, many of us know that tax bureaus, auto warranty companies and the like won't call us with urgent fines or fees we must pay in the form of prepaid cards. Yet the Federal Communications Commission's nearly $300 million fine against a massive transnational robocalling operation shows just how widespread the problem has become.
But what about when the voice of someone you know is on the other end of the line (your CEO, spouse, or grandkid) urgently requesting money to help get them out of a pickle?
With the insidious use of generative artificial intelligence mimicking the voice of someone you know and communicating with you in real-time, that call becomes inherently untrustworthy.
The phone system was built on trust, says Jonathan Nelson, director of product management at telephony analytics and software company Hiya Inc. "We used to be able to assume that if your phone rang, there was a physical copper wire that we could follow all the way between those two points, and that disappeared," Nelson said. "But the trust that it implied didn't."
Now, the only call you can trust is the one you initiate yourself. But with a quarter of all non-contact calls reported as spam, meaning fraudulent or simply a nuisance, according to Hiya's Global Call Threat Report for Q2 2023, that's a lot of verification.
A report on AI and cybersecurity from digital security company McAfee says that 52% of Americans share their voice online, which gives scammers the main ingredient for creating a digitally generated version of your voice and using it to victimize people you know. Such voice clones can be deployed through interactive voice response (IVR) systems in a type of spam called voice phishing, or "vishing." While spear phishing once took a lot of time and money, Nelson said, "generative AI can kind of take what used to be a really specialized spam attack and make it much more commonplace."
According to McAfee's CTO Steve Grobman, these types of calls are bound to remain less common than other, more obvious spam calls, at least for the time being. However, "they're putting the victim in a more tenuous situation where they're more likely to act, which is why it's important to be prepared," Grobman said.
Spotting AI scams
That preparation depends on a combination of consumer education and the war between technologies, or more specifically, white-hat AI fighting black-hat AI.
Companies like McAfee and Hiya are on the front lines of this fight, spotting AI scam patterns (such as historical call patterns that function much like a credit history for phone numbers) and finding ways to obstruct them.
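To make the "credit history for phone numbers" analogy concrete, here is a minimal, hypothetical sketch of how a reputation score might weigh a number's historical call patterns. It is not Hiya's or McAfee's actual model; the features, weights, and thresholds are illustrative assumptions only.

```python
# Illustrative sketch only: a toy "reputation score" for a phone number based on
# its historical call patterns. The fields, weights, and cutoffs are hypothetical,
# not any vendor's real algorithm.
from dataclasses import dataclass


@dataclass
class CallHistory:
    total_calls: int          # calls placed by this number in the window
    answered: int             # calls that were picked up
    avg_duration_sec: float   # average length of answered calls
    user_spam_reports: int    # spam reports filed by recipients
    distinct_recipients: int  # how many different numbers were dialed


def reputation_score(h: CallHistory) -> float:
    """Return a score in [0, 1]; lower means more spam-like."""
    if h.total_calls == 0:
        return 0.5  # no history: neutral, like a thin credit file

    answer_rate = h.answered / h.total_calls
    report_rate = h.user_spam_reports / h.total_calls
    # Very short answered calls and wide, shotgun-style dialing are red flags.
    short_call_penalty = 0.2 if h.avg_duration_sec < 10 else 0.0
    fanout_penalty = 0.2 if h.distinct_recipients > 0.9 * h.total_calls else 0.0

    score = (0.5 + 0.4 * answer_rate - 0.6 * report_rate
             - short_call_penalty - fanout_penalty)
    return max(0.0, min(1.0, score))


if __name__ == "__main__":
    robocaller = CallHistory(total_calls=5000, answered=400,
                             avg_duration_sec=8, user_spam_reports=120,
                             distinct_recipients=4900)
    print(f"robocaller score: {reputation_score(robocaller):.2f}")  # prints a low score
```

In practice, vendors combine many more signals and machine-learned weights, but the idea is the same: numbers earn or lose trust based on how they have behaved over time.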
Although the U.S. federal government spearheaded the IRS scam investigation (the 2023 podcast Chameleon: Scam Likely offers a detailed deep dive into the logistics of that investigation), its response to AI technology's augmentation of robocalling is disorganized, one expert says.
Kristofor Healey is a former special agent for the Department of Homeland Security who now works in the private sector as CEO of Black Bear Security Consultants. He spent his time in the federal government investigating large-scale money laundering organizations and led the team that took down the IRS scam, the largest telefraud case in U.S. history.
Healey says the government and law enforcement are inherently reactive systems, and that AI as a tool for businesses such as call centers ("whether they are good call centers or bad call centers," he said) is going to multiply the cases they must react to.
Educating people about deepfake audio spam calls
Ultimately, technology can only be so proactive because cybercriminals always take things to the next level. Business and consumer education is the only truly proactive approach available, experts say, and it requires getting the word out about how people can protect themselves and those around them.
For businesses, this may mean incorporating education on deepfake audio spam calls into required employee cybersecurity training. For individuals, it could mean being more discerning about what you post online. Grobman said, "Sometimes risky behavior will have a higher likelihood of impacting someone around you than impacting you directly." Criminals could use what we post on social media in an AI-generated, voice-cloned call to build rapport with other victims.
Meanwhile, identity protection and personal data cleanup services will continue to be useful for consumers. Policies around how employees must behave when receiving a non-contact call and what they share online — even on their personal profiles — could become increasingly commonplace.
Grobman recommends that families come up with a duress word or validation word they can use to confirm it's really a loved one on the other end of the line. It's like a spoken password; as with digital passwords, avoid using the names of your pets or children, or any other information that is readily available.
What if someone calls stating they’re from a company? Hang up, look up the company’s contact information (don’t just call back the number that called you), and call it yourself for verification. “It’s incredibly important to validate independently through a trusted channel,” Grobman said.
For his part, Healey acts as a sort of telefraud vigilante, always picking up the phone when a spam number shows up on the screen. He doesn't give them any confirming information, tell them who he is, or share anything about himself. He simply keeps them on the line as long as possible, costing them money while their voice-over-IP technology is at work.
“Keeping them on the phone is an effective way to prevent them from harming someone else,” said Healey.
The widespread IRS scam that Healey investigated and the podcast Chameleon: Scam Likely covered had tangible consequences for victims: shame, loss of financial security, loss of relationships, even loss of life. To the trained ear, spam calls can sound silly, but people like the elderly or those in vulnerable states of mind have fallen, and continue to fall, for the charade.
With AI technology mimicking the voices of our acquaintances, friends or loved ones, the game becomes more deeply embedded in the psyche. And it is a game. At some point, Chameleon notes, it ceases to be about the money and becomes about the achievement, adrenaline, and power. But while education on this ever-evolving threat makes its rounds, technology helps to fight back.