After targeting politicians in the UK, Slovakia and other parts of the world, audio scammers have finally made a splash in the US. They cloned President Joe Biden’s voice and turned it into a robocall, a campaign tactic that goes back to the 1970s. Welcome to 2024, where the political voice you hear on the phone might have been conjured on the internet.
The automated phone message alarmed election experts when it went out over the weekend, playing a voice edited to sound just like Biden and telling New Hampshire residents not to vote in Tuesday’s Democratic primary. “Save your vote for the November election,” it said, before tacking on a Biden catchphrase: “What a bunch of malarkey.” Misinformation researchers are rightly worried about so-called audio deepfakes emerging at the start of a big election year, when roughly half the world will be casting a ballot. While fake videos and pictures are eye-catching and dramatic, fake audio clips are more dangerous.
Think of them as the mosquitoes of misinformation. They’re small and easy to produce, tough to spot and almost impossible to track. And they can spread false information to disastrous effect.
Last year, for instance, a Slovak political party may well have lost a national election after an audio deepfake of its leader went viral two days before the vote. Governments are well aware of the problem. Biden himself signed an executive order late last year that aims to steer how AI is developed without putting the public at risk.
But the genie is already out of the bottle. Dozens of companies now offer tools to clone any voice, your own or someone else’s, and some are stricter about fakes than others. The British AI company Synthesia, for instance,