
Scammers are conning people by digitally copying the voices of their loved ones

Criminals are using artificial-intelligence software to reproduce voices and scam people over the phone by posing as someone close to them.

Scammers are using artificial intelligence to con people by replicating the voice of a loved one on the phone. According to a Washington Post article published on Sunday, a 70-year-old couple in Canada received a call from a man whose voice sounded like their grandson's, informing them that he was in jail and needed bail money.

Without a moment's hesitation, the grandparents withdrew 3,000 Canadian dollars from a first bank, then went to a second branch to withdraw more. It was there that the bank manager informed them that another customer had already received a similar call, and that the voice on the other end of the line had turned out to be fake. The couple then realized they were about to be scammed.

Vanished money

Another couple was less fortunate. The parents received a call from a lawyer telling them that their son had killed a US diplomat in a car accident, was in jail, and needed money for legal fees. The lawyer then put their son on the phone, who told them he needed 21,000 Canadian dollars.

Although the call was unusual, the parents, convinced they had spoken with their son, complied and withdrew the money from several banks, then sent it to the lawyer through a bitcoin terminal. The couple was bewildered when they later received a genuine call from their son telling them that everything was fine. Despite alerting the police, the parents were unable to recover their money.

A 30-second sample is enough

According to the Washington Post, artificial intelligence makes it very easy for scammers to reproduce a person's voice from an audio sample of just a few sentences. A 30-second snippet is enough to clone a voice, which scammers can then make say whatever they type. Most of the time, they use it to convince victims that a loved one is in danger.

Justice and law enforcement remain largely powerless against this type of scam. Most victims have few leads to identify the perpetrator, and it is difficult for police to trace the calls. As for the courts, there are very few legal precedents for holding the companies that develop these tools accountable for how they are used, the American newspaper explains.

Among these companies is ElevenLabs, a speech-synthesis startup founded in 2022, whose software has notably been used to make celebrities say anything, such as Emma Watson reciting passages from Adolf Hitler's "Mein Kampf." In late January, the company announced that it would strengthen its safeguards against abuse, notably by launching a tool to detect AI-generated voices.

Author: Marius Boquet
Source: BFM TV
