Voice clones, or deepfakes, have emerged as the latest tool for cyber scammers, experts said, as scams involving artificial intelligence (AI) are being reported with increasing frequency from different parts of the country.
Voice deepfaking began as an entertainment gimmick for mimicking songs in Instagram reels on websites such as Covers.ai, Voicify.ai and VoiceFlip.ai. It has since spiralled into a larger problem, with legitimate AI startups such as ElevenLabs, Speechify, Respeecher, Narakeet, Murf.ai, Lovo.ai and Play.ht being weaponised by scammers.
More than a dozen websites offering free voice cloning have proliferated on the internet, with accuracy as high as 95% in 29 languages and more than 50 accents. There are also professional voice cloning models that can mirror every intonation, rhythm and nuance of a voice.
“Deepfakes in general are quite dangerous, and particularly voice AI shall soon evolve into an organised phishing tool. For instance, job scams will now convert from a WhatsApp message to an actual HR voice calling you,” said Jaspreet Bindra, founder of Tech Whisperer.
“But, technological solutions, increased awareness and, of course, regulation shall also develop to keep a check on the proliferation of such scams.”
A voice clone is synthetic audio created using generative AI tools trained on sample audio of a person. To create a clone, source audio is needed, which can be anything from an Instagram story