AI: New voice cloning technology allows scammers to impersonate anyone

As artificial intelligence technology continues to advance, scammers are finding new ways to exploit it.

Voice cloning has emerged as a particularly dangerous tool, with scammers using it to imitate the voices of people their victims know and trust in order to deceive them into handing over money.

“People will soon be able to use tools like ChatGPT or even Bing and eventually Google, to create voices that sound very much like their voice, use their cadence,” said Marie Haynes, an artificial intelligence expert. “And [they] will be very, very difficult to distinguish from an actual real live person.”

She warns that voice cloning will be a new tool for scammers who pretend to be someone else.

Carmi Levy, a technology analyst, explains that scammers can even spoof the phone numbers of family and friends, making it look like the call is actually coming from the person they are impersonating.

“Scammers are using increasingly sophisticated tools to convince us that when the phone rings it is in fact coming from that family member or that significant other. That person we know,” he says.

Levy advises people who receive suspicious calls to hang up and call the person they think is calling them directly.

“If you get a call and it sounds just a little bit off, the first thing you should do is say ‘Okay, thank you very much for letting me know. I’m going to call my grandson, my granddaughter, whoever it is that you’re telling me is in trouble directly.’ Then get off the phone and call them,” he advises.

Haynes also warns that voice cloning is just the beginning, with AI powerful enough to clone someone’s face as well.

“Soon, if I get a FaceTime call, how am I going to know that it’s legitimately somebody that I know?” she says. “Maybe it’s someone pretending to be that person.”

As this technology becomes more widespread, experts are urging people to be vigilant and to verify calls from friends and family before sending any money.

“There are all sorts of tools that can take written word and create a voice out of it,” says Haynes. “We are soon going to be finding that scam calls are going to be really, really on the rise.”
