Artificial Intelligence (AI) voice cloning tools have become the latest weapon of choice for cybercriminals, according to a recent study by McAfee Labs.
With a three-second audio sample, fraudsters can now convincingly mimic almost anyone's voice to send deceptive voicemail or text messages.
Scammers commonly impersonate someone known to the victim, creating urgent and distressing scenarios to manipulate the victim into sending money.
Alarmingly, these tactics have proved effective: 77% of those who received such messages reported losing money as a result.
What’s more concerning is that scammers are now recording AI-generated voice clones over real news footage to lead people to investment scam websites.
In Australia, Scamwatch is warning the public about AI-cloned videos of public figures and celebrities touting fraudulent investment opportunities.
"Scammers are recording AI-generated voice clones over real news footage to lead people to investment scam websites. Don't be lured by online videos featuring celebrities or public figures claiming you'll make guaranteed money on an investment opportunity." (pic.twitter.com/bNVlsI7KKl) — ACCC Scamwatch (@Scamwatch_gov), May 25, 2023
Reported losses
The losses reported by victims varied widely: 36% of respondents who had lost money to scams reported losses of between US$500 and US$3,000.
However, a small percentage (7%) reported significant losses of between US$5,000 and US$15,000.
In many instances, cybercriminals are exploiting readily available voice data to perpetrate these scams.
The study found that 53% of adults shared their voice data online or in recorded notes at least once a week.
Easy accessibility to cloning tools
During the study, McAfee researchers found more than a dozen AI voice cloning tools freely available on the internet.
With only a basic level of expertise, users of these tools can create convincing voice clones, raising concerns about the potential for increased fraudulent activity.
Despite the challenges posed by more distinctive voices, the researchers found that they could replicate various global accents with considerable accuracy.
The study warns of the potential for increasingly sophisticated cybercrime as AI technology continues to evolve.
How do I protect myself?
In light of these findings, the public is urged to take precautions, including establishing a secret codeword with close family and friends.
Scrutinising the source of all calls and messages, exercising caution when sharing information on social media, using identity monitoring services, and removing personal information from data broker sites are also strongly recommended.