Vietnam.vn - Vietnam Promotion Platform

Security expert shows how to avoid Deepfake scams

Báo Hà Tĩnh - 29/03/2023


According to security experts, scams using Deepfake technology are very dangerous, but there are ways for users to recognize videos made with this artificial intelligence technology and avoid being fooled by scammers.

The risk of a wave of fraud using Deepfake technology

In a warning issued on March 24, Tuyen Quang Provincial Police said that a highly sophisticated method of online fraud and property appropriation has recently appeared. Scammers use Deepfake technology (which applies artificial intelligence - AI - to create audio, image, and video products that imitate real people with very high accuracy) to make fake video calls: borrowing money from relatives, impersonating children studying abroad who call their parents asking for tuition money, or fabricating emergencies such as a relative having an accident and urgently needing money for treatment...

In recent days, police in some other localities such as Hanoi and Thai Nguyen have continued to warn about fraudulent tricks to appropriate assets on the internet using Deepfake technology.


Police in several provinces and cities have issued warnings about fraud using Deepfake technology. (Illustration photo: Internet)

Speaking at the workshop "Cybersecurity in digital space - Trends and opportunities" organized by the Posts and Telecommunications Institute of Technology (PTIT) on March 27, expert Ngo Minh Hieu, co-founder of the ChongLuaDao.vn project and a threat hunter at the National Cyber Security Monitoring Center, said that Deepfake videos have only just appeared in Vietnam, but international criminal groups have been applying this AI-based fraud and forgery technique for about 2-3 years.

Expert Ngo Minh Hieu also noted that many tools available on the internet help scammers create Deepfake videos. Once cybercriminals in Vietnam learn how to steal videos and images, cut and paste them, and then use online tools to create Deepfakes, there will be a wave of "4.0 generation" fraud, turning even people who have long been hard to fool into victims.

“Middle-aged people and older are the most vulnerable to scams because they lack awareness of information security and technology and have difficulty recognizing this type of sophisticated scam,” expert Ngo Minh Hieu added.


Expert Ngo Minh Hieu shared with PTIT students about forms of fraud, including fraud using Deepfake.

Discussing the issue, NCS Company Technical Director Vu Ngoc Son said that Deepfake applications are becoming increasingly popular, even raising concerns about loss of control. Initially, these applications were created for "face swapping", letting users easily replace their faces and voices with those of characters in famous movies. It can be seen as the next step in the trend of photo editing and humorous dubbing of clips that went viral some time ago.

However, scammers have exploited Deepfake applications to create clips with fraudulent, impersonating content. Many people have lost money because they believed that relatives, colleagues, or superiors were calling to ask them to transfer money. Because the technology is increasingly used for malicious purposes, Deepfake is now often associated with bad tools.

How to avoid scams using Deepfake technology

On prevention, expert Vu Ngoc Son explained that because the computing power behind Deepfake applications is not yet perfect, clips made with this technology often have small file sizes, short durations, and low audio and image quality. Most noticeably, the face is quite stiff and shows little emotion. The character in a Deepfake also moves less than in a normal clip, especially in turning the face up or down, and will avoid actions such as rubbing or covering the face, because the AI produces errors when the face is partially obscured. If you pay close attention, you can therefore detect it.

“Users should not trust clips that are short, blurry, or low quality; that show little facial expression or body movement; where the face never turns sideways; or where the voice is unnatural, too monotonous, and without pauses,” expert Vu Ngoc Son recommends.

The NCS Company expert also stressed that vigilance is most important. Specifically: do not trust short clips or video calls with poor image or sound quality; verify by calling back on a regular phone; ask the caller to put a hand on their face or turn left and right; and talk as long as possible to make sure the call is live and not a pre-recorded video. Above all, users should not transfer money or send information to unfamiliar addresses or accounts.
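The warning signs described above can be loosely captured in code. The sketch below is purely illustrative: the function name and every threshold are assumptions invented for this example, not values given by the experts, who only note that Deepfake clips tend to be short, low-resolution, and low-bitrate.

```python
# Illustrative heuristic sketch only. The thresholds below are assumed
# values for demonstration; the article's experts give qualitative signs
# (short duration, small file size, low audio/image quality), not numbers.

def deepfake_risk_flags(duration_s, width, height, bitrate_kbps):
    """Return a list of heuristic warning flags for a video clip."""
    flags = []
    if duration_s < 30:              # assumed cutoff: very short clip
        flags.append("short duration")
    if width * height < 640 * 480:   # assumed cutoff: low resolution
        flags.append("low resolution")
    if bitrate_kbps < 500:           # assumed cutoff: low bitrate / small file
        flags.append("low bitrate")
    return flags
```

Such automated checks can only complement, never replace, the human verification steps the experts recommend, such as calling back on a regular phone line.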

Sharing the same view, expert Ngo Minh Hieu recommends that the best defense against scams is to stay alert and vigilant. When someone on social networks, even someone in your friends list, suddenly asks to borrow money or sends a strange link, do not rush; stay calm, check, and verify.

“Users can proactively authenticate by calling directly or making a video call for at least one minute, then asking personal questions that only you and the other person know, because a Deepfake cannot fake a real conversation in real time with high accuracy,” expert Ngo Minh Hieu advised.

According to VNN


