
Published March 17, 2023 5:02 p.m. ET
Artificial intelligence expert Marie Haynes says AI tools will soon make it difficult to distinguish AI from a real person’s voice. (Dave Charbonneau/CTV News Ottawa)
As artificial intelligence technology continues to advance, scammers are finding new ways to exploit it.
Voice cloning has emerged as a particularly dangerous tool, with scammers using it to imitate the voices of people their victims know and trust in order to trick them into handing over money.
“People will soon be able to use tools like ChatGPT or even Bing and eventually Google, to create voices that sound very much like their voice, use their cadence,” said Marie Haynes, an artificial intelligence expert. “And will be really, really hard to distinguish from an actual real live person.”
She warns that voice cloning will be the new tool of choice for scammers pretending to be someone else.
Carmi Levy, a technology analyst, explains that scammers can even spoof the phone numbers of family and friends, making it look like the call is actually coming from the person they are impersonating.
“Scammers are using increasingly sophisticated tools to convince us that when the phone rings it is in fact coming from that family member or that significant other. That person that we know,” he says.
Levy advises people who receive suspicious calls to hang up and contact the person they believe is calling them directly.
“If you get a call and it sounds just a little bit off, the first thing you should do is say ‘Okay, thank you very much for letting me know. I’m going to call my grandson, my granddaughter, whoever it is that you’re telling me is in trouble directly.’ Then get off the phone and call them,” he advises.
Haynes also warns that voice cloning is just the beginning, with AI powerful enough to clone someone’s face as well.
“Soon, if I get a FaceTime call, how am I going to know that it’s legitimately somebody that I know,” she says. “Maybe it’s somebody pretending to be that person.”
As this technology becomes more widespread, experts are urging people to be vigilant and to verify calls from friends and family before sending any money.
“There are all sorts of tools that can take the written word and create a voice out of it,” says Haynes. “We are soon going to be finding that scam calls are going to be really, really on the rise.”