
AI voice scams wreak havoc across the US

Báo Ninh Thuận - 12/06/2023

The voice on the phone sounded terrifyingly real. A mother heard her daughter sobbing before a man snatched the phone and demanded ransom.

But according to the AFP news agency, the daughter's voice was in fact a product of artificial intelligence (AI), and the kidnapping was entirely fake.

Faced with this kind of fraud, experts say the biggest danger of AI is its ability to blur the line between reality and fiction, giving cybercriminals a cheap yet effective technology for carrying out their schemes.

In a new series of scams that are shaking up the US, scammers are using AI voice-cloning tools that are surprisingly realistic — yet easily accessible online — to steal money by impersonating family members of their victims.

“Help me, Mom, help me,” Arizona resident Jennifer DeStefano heard a frantic voice plead on the other end of the line.

Illustration (AI Magazine): 70% of respondents do not believe they can distinguish between an AI voice and a real human voice.

DeStefano was 100 percent convinced her 15-year-old daughter had been in a skiing accident. “I never doubted it. That was her voice. That was the way she would cry. I never doubted it for a second,” DeStefano told a local television station in April.

The scammer, calling from an unknown number that appeared on DeStefano's phone, demanded a ransom of up to $1 million. The AI-powered extortion scheme fell apart within minutes, when DeStefano managed to reach her daughter.

But the horrific case highlights the potential for cybercriminals to abuse AI technologies to impersonate others.

“AI’s voice cloning capabilities are now almost indistinguishable from a real human voice, allowing scammers to extract information and money from victims more efficiently,” Wasim Khaled, CEO of Blackbird.AI, told AFP.

A simple internet search turns up a plethora of freely available apps that can generate an AI voice from just a short sample recording, sometimes only a few seconds long. As a result, a person's real voice can easily be harvested from the content they post online.

“With a small audio sample, AI voice copying can be used to send voicemails and voice texts. It can even be used as a live voice changer during phone calls,” Khaled noted.

Scammers may use different accents, genders, or even mimic the way their victims speak.

In a global survey of 7,000 people across nine countries, including the US, one in four said they had either been targeted by an AI voice scam themselves or knew someone who had.

70% of respondents said they were not confident they could tell the difference between a cloned voice and a real one.

Meanwhile, US officials have repeatedly warned about the growing frequency of scams targeting elderly people. In a common scenario, the impersonator poses as the victim's grandchild, claiming to be in trouble and in urgent need of money.

“You get a call. There's a panicked voice on the line. It's your grandson. He says he's in big trouble. He wrecked his car and is in jail. But you can help by sending him money,” is how the US Federal Trade Commission (FTC) describes the scam.

In the comments below the FTC's warning bulletin, many people said their elderly family members had been scammed in this way.

That is what happened to the grandfather of 19-year-old Eddie from Chicago, who received a call from someone impersonating Eddie and saying he needed money because he had caused a car accident.

The ruse was so convincing that Eddie's grandfather rushed to scrape the money together and even considered mortgaging his house, before the lie was discovered, according to a report by McAfee Labs.

“Because it is so easy to create highly realistic voice clones, almost anyone with an online presence is vulnerable,” UC Berkeley School of Information Professor Hany Farid told AFP. “These scams are growing and spreading.”

In early 2023, AI startup ElevenLabs acknowledged that its voice-cloning tool could be misused for malicious purposes. Specifically, a user of the company's product had posted a fake audio clip of actress Emma Watson appearing to read Mein Kampf, the manifesto of German Nazi leader Adolf Hitler.

“We are moving so fast that you can't trust everything you see on the internet, so we need new technology to know whether the person you are talking to is really who they say they are,” Gal Tal-Hochberg, chief technology officer at venture capital firm Team8, told AFP.

According to VNA/Tin Tuc Newspaper


