
Deepfake voice scams are becoming increasingly sophisticated; what should we do?

Deepfake voice technology can imitate a real person's voice almost perfectly, and many people have fallen into the trap of trusting a familiar-sounding caller.

Báo Tuổi Trẻ - 09/07/2025

Photo 1: Scams involving voice imitation with deepfake voice.

In the age of rapidly developing artificial intelligence, voice – once considered reliable evidence – has now become a dangerous tool in the hands of criminals. Deepfake voice technology allows for the creation of voices that are virtually indistinguishable from real people, enabling sophisticated scam calls aimed at defrauding and stealing assets.

Why are deepfake voices so scary?

Deepfake voice is a technology that uses artificial intelligence (AI) and machine learning to create a fake voice that sounds exactly like a real person's voice.

With the help of modern models such as Tacotron, WaveNet, and ElevenLabs, or voice cloning platforms like Respeecher, criminals need only 3 to 10 seconds of sample audio to create a deepfake that is up to 95% convincing.
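For readers curious how so little audio can be enough, the sketch below outlines the typical three-stage pipeline behind such systems: a speaker encoder turns the short reference clip into a "voice fingerprint", an acoustic model (the role Tacotron-style networks play) turns text plus that fingerprint into a spectrogram, and a vocoder (the role WaveNet-style networks play) turns the spectrogram into audio. Every function here is a hypothetical placeholder using toy arithmetic, not a working model or a real library.

```python
import numpy as np

# Illustrative sketch of the three stages a neural voice-cloning system chains
# together. The maths below is placeholder arithmetic, not a trained model;
# real systems use learned networks at each step.

SAMPLE_RATE = 16_000  # 16 kHz mono audio, a common choice for speech models


def speaker_embedding(reference_clip: np.ndarray) -> np.ndarray:
    """Stage 1: compress a short reference clip (a few seconds) into a
    fixed-size 'voice fingerprint'. Real systems use a trained speaker encoder."""
    frames = reference_clip[: SAMPLE_RATE * 10].reshape(-1, 160)  # 10 ms frames
    return frames.mean(axis=0)  # placeholder for a learned embedding


def acoustic_model(text: str, voice: np.ndarray) -> np.ndarray:
    """Stage 2: map text plus the voice embedding to a mel spectrogram
    (the role Tacotron-style models play)."""
    n_frames = 20 * len(text)  # rough length heuristic, placeholder only
    return np.random.default_rng(0).random((n_frames, 80)) * voice.mean()


def vocoder(mel: np.ndarray) -> np.ndarray:
    """Stage 3: turn the spectrogram into a waveform
    (the role WaveNet-style vocoders play)."""
    return np.repeat(mel.mean(axis=1), 200)  # placeholder waveform


if __name__ == "__main__":
    reference = np.random.default_rng(1).random(SAMPLE_RATE * 5)  # a "5-second clip"
    voice = speaker_embedding(reference)
    audio = vocoder(acoustic_model("Hello, can you hear me clearly?", voice))
    print(f"Synthesised {audio.size / SAMPLE_RATE:.1f} s of audio from a 5 s sample")
```

The point of the sketch is the proportions, not the output: a few seconds of reference audio is enough to condition the later stages, which can then read out arbitrary text in the captured voice.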

Deepfake voices are particularly dangerous because of their ability to mimic speech almost perfectly, from pronunciation and intonation to even the unique speech patterns of each individual.

This makes it very difficult for victims to distinguish between real and fake voices, especially when the voice belongs to a relative, friend, or superior.

Harvesting voice recordings is also very easy, since most people now share their voice publicly through platforms like TikTok, social media livestreams, podcasts, or online meetings. More worryingly, unlike fake images or videos, deepfake voices leave no visual trace, which makes investigation difficult and makes it easy for victims to lose money.

Photo 2: A deepfake can be created with just a few seconds of voice sample.

Deepfake voice scams are becoming increasingly sophisticated, often using a familiar script: impersonating an acquaintance in an emergency situation to create panic and pressure victims into transferring money immediately.

In Vietnam, there have been cases of mothers receiving calls from "their sons" claiming to have been in accidents and urgently needing money. In the UK, a company director was scammed out of over $240,000 after hearing his "boss" request a money transfer over the phone. An administrative employee was also tricked when he received a call from a "senior boss" requesting payment for a "strategic partner"...

The common thread in these situations is that the fake voice is reproduced to sound exactly like a relative or superior, leading the victim to believe it completely and giving them no time to verify.

Always verify, don't trust immediately

Given the increase in deepfake voice scams, people are advised not to transfer money based solely on voice calls, even if the voice sounds exactly like a relative. Instead, call back the original number or verify the information through multiple channels before making any transactions.

Many experts also recommend setting up an "internal password" within the family or business for verification in unusual situations.
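To make the idea concrete, here is a minimal sketch of how a payment request received by phone could be parked until it is confirmed with such a pre-agreed passphrase over a second, independently initiated channel. The function names, the pending-request structure, and the salted-hash storage are assumptions made for this illustration, not a prescribed standard.

```python
import hashlib
import hmac
import os
import secrets

# Minimal sketch of an out-of-band check for payment requests received by
# phone: the request is held until it is confirmed with a pre-agreed
# passphrase through a second channel (e.g. a callback to a number already
# on file). All names and the overall flow are illustrative assumptions.


def enroll_passphrase(passphrase: str) -> tuple[bytes, bytes]:
    """Store only a salted hash of the shared 'internal password'."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)
    return salt, digest


def passphrase_matches(passphrase: str, salt: bytes, digest: bytes) -> bool:
    """Constant-time comparison so the check itself leaks nothing."""
    candidate = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)


def handle_phone_request(amount: float, claimed_caller: str) -> dict:
    """Never act on the call alone: park the request until it is verified."""
    return {
        "id": secrets.token_hex(8),
        "amount": amount,
        "claimed_caller": claimed_caller,
        "status": "pending-verification",
    }


if __name__ == "__main__":
    salt, digest = enroll_passphrase("agreed in person, never over the phone")
    request = handle_phone_request(240_000, "the boss")
    # Confirmation should come from a callback the employee places to a
    # known number, not from the incoming call itself.
    confirmed = passphrase_matches("agreed in person, never over the phone", salt, digest)
    request["status"] = "approved" if confirmed else "rejected"
    print(request)
```

The design choice worth noting is that the incoming call never changes the request's status by itself; only a check initiated from the recipient's side can move it out of "pending-verification".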

Furthermore, it's necessary to limit the uploading of videos with clear audio to social media, especially lengthy content. In particular, proactive warnings and guidance should be provided to vulnerable groups such as the elderly or those with limited access to technology, as these are prime targets for high-tech scams.

Photo 3: The voices of family, friends, and colleagues can all be faked.

In many countries, authorities have begun to tighten regulation of deepfake technology through their own legal frameworks.

In the US, several states have banned the use of deepfakes in election campaigns or for spreading misinformation. The European Union (EU) has passed the AI Act, which requires organizations to be transparent and to give clear warnings when content is generated by artificial intelligence.

Meanwhile, in Vietnam, although there are no specific regulations for deepfake voices, related acts can be prosecuted under current law, with charges such as fraud, invasion of privacy, or identity impersonation.

However, the reality is that technology is developing at a pace far exceeding the law's monitoring capacity, leaving many loopholes that malicious actors can exploit.

When voice is no longer evidence

A voice was once closely tied to a person and trusted as evidence, but with deepfake voices it is no longer reliable proof. In the age of AI, individuals need digital self-defense skills, proactive verification, and constant vigilance, because any call could be a trap.

PHAN HAI DANG

Source: https://tuoitre.vn/lua-dao-bang-deepfake-voice-ngay-cang-tinh-vi-phai-lam-sao-20250709105303634.htm

