AI ‘voice clone’ scams increasingly hitting elderly Americans, senators warn
Generative artificial intelligence systems are already making it easier for scammers to con elderly Americans out of their money, and several senators are asking the Biden administration to step in and protect people from this quickly emerging threat.
Sen. Mike Braun, R-Ind., the top Republican on the Senate Special Committee on Aging, spearheaded a bipartisan letter to the Federal Trade Commission (FTC) on Thursday that asks for an update on what the agency knows about AI-driven scams against the elderly and what it is doing to protect people. The letter, signed by every member of the Senate committee from both parties, asks about AI-powered technology that can be used to replicate people’s voices.
The letter to FTC Chairwoman Lina Khan warned that voice clones and chatbots allow scammers to deceive elderly victims into believing they are talking to a relative or close friend, leaving them vulnerable to theft.
‘In one case, a scammer used this approach to convince an older couple that the scammer was their grandson in desperate need of money to make bail, and the couple almost lost $9,400 before a bank official alerted them to the potential fraud,’ the Senate letter said. ‘Similarly, in Arizona, a scammer posing as a kidnapper used voice-cloning technology to duplicate the sounds of a mother’s crying daughter and demand ransom.’
In an interview with Fox News Digital, Braun said ‘imposter’ scams lead to about $2.6 billion in losses every year, and he said the elderly are particularly at risk now that scammers have access to voice-clone technology.
‘We’re getting calls into our constituent services line back in Indiana already where this is coming in and happening to some extent,’ Braun said. He added that imposter scams can be done without using a fake voice but warned that ‘AI makes it even easier because it’s like talking to your grandkid.’
Braun’s staff said they have also heard a complaint about a scam that used a voice that sounded like movie and pop star Jennifer Lopez. Braun recalled a Senate hearing this week in which Sen. Richard Blumenthal, D-Conn., opened the hearing on AI with an AI-generated voice that sounded like him, reading off an AI-generated script, and said scammers have access to these same tools.
‘When you can replicate a voice to the extent I couldn’t tell if that was Sen. Blumenthal or a replication – it sounded exactly like him – just imagine,’ Braun said. ‘That is a tool that the scammers never had.’
The FTC has made it clear it will use its authority to protect consumers from AI-enabled fraud to the extent it can as Washington policymakers look to expand their regulatory oversight of this new technology. The Senate letter to the agency suggested that the FTC update its ‘educational and awareness’ materials to help seniors understand that scammers may be looking to fleece them out of their money using AI-generated voices.
Braun said FTC efforts to create these sorts of public service messages are a good start, adding that the Senate Special Committee on Aging maintains a hotline on scams against the elderly, which he expects will soon start receiving complaints about voice-clone technology. He said the reports collected by the committee could feed into legislative efforts.
Braun predicted that Washington is likely to take up more regulatory efforts on AI in the future, and it didn’t go unnoticed that OpenAI CEO Sam Altman and other industry officials who testified on AI this week seemed to be inviting more federal oversight.
‘I’ve never seen any new technology, new business, where the people that created it have been more worried about how you use it,’ he said. ‘They’re worried that if they’re going to get any monetary value out of it, they are going to have to make sure it’s well-regulated.’
‘I just think there’s no way that AI can go unchecked, and I’m glad to see the people … on the forefront are thinking the same way,’ he said.