Bad actors can use AI to splice people into sensitive photos and videos for defamation and blackmail - Photo: AI-generated illustration
In the past, grafting one person's face onto another person's body required Photoshop or specialized photo-editing software - a laborious process whose results were easy to spot with the naked eye.
Now, AI tools can do the same thing in just a few steps, and the results are very hard to detect with the naked eye.
A face-swapped sex video used for blackmail
Because of his work, Mr. H. (the director of a business in Ho Chi Minh City) is often active on social networks.
One day, a young woman sent him a friend request and asked about his work. After a period of chatting and sharing details of their work and personal lives, the two grew quite close.
They exchanged many affectionate text messages, sent photos back and forth, and made video calls to see each other.
One day, Mr. H. suddenly received a call from a man claiming to be the woman's husband. After an aggressive opening tirade, the "husband" demanded that Mr. H. transfer money as compensation; otherwise, screenshots of the chats and a sex video of the two would be sent to his relatives and business partners.
The "husband" then showed Mr. H. a sex video featuring him and the woman, a recording of a video call between them, and screenshots of their intimate messages.
"I have never met that woman in person, so I am sure the sex video was fabricated," Mr. H. insisted. However, the woman's face in the sex video matched the woman in his video calls exactly, and together with the screenshots of the affectionate messages, this pressured Mr. H. into transferring money to the "husband" multiple times to protect his reputation.
According to Tuoi Tre's investigation, Mr. H. was scammed in an organized manner, following a carefully scripted scenario. He was fooled by deepfakes twice: first in the video call, then in the sex video.
Notably, Mr. H.'s face was grafted into the sex video so skillfully that viewers found it hard to believe it was fake. An AI expert confirmed to Tuoi Tre that the perpetrators used a new AI tool that can graft faces into videos almost flawlessly, making them look real.
Face-swapping AI tools abound
Many AI tools now offer features such as generating a person with any desired face, or merging a chosen face onto another person's body.
Some of these features lend themselves to abuse, such as turning ordinary photos into nude images by "undressing" the person in the photo, or replacing a person's face in a video with someone else's.
Most of these AI tools offer free trials, but paying unlocks their full features and much faster processing.
For example, to merge a face onto another person's body, a user simply uploads two corresponding photos and waits a few seconds for the result.
Generating a nude image from an ordinary photo takes only tens of seconds - startlingly fast.
Even replacing a face in a sex video takes the AI tool less than a minute for a clip a few seconds long.
Speaking with Tuoi Tre, Mr. Nguyen Kim Tho, head of the research and development team at cybersecurity company VNetwork, explained that the term deepfake refers to fake videos and images created by AI that viewers mistake for the real thing.
Deepfake technology uses deep learning models such as neural networks to analyze a person's facial and voice data, then generates fake content that looks and sounds exactly like that person.
"Thanks to powerful algorithms, bad actors can insert a victim's face into sensitive videos (for example, 'bedroom' videos or pornography) or edit images to create fake nude photos.
Deepfake software and apps are now widespread on the Internet - including mobile apps, open-source code, and free online services - making it easy for anyone to access tools that create fake videos and photos, so people need to stay vigilant," said Mr. Tho.
Stay away from "sex chat"
Many scams begin with an online friendship, then lure the victim into "sex chat" or into sending sensitive photos. Users should therefore never send private photos or videos to people they know only online, no matter what those people promise or how they threaten.
Always remember that any content you share (even via private messages) can be captured (screenshots, screen recordings) and later edited and used against you.
If you make a video call, be wary of strangers asking for sensitive content - the other side could be a fake screen, or they could be recording you.
Cybersecurity experts advise users never to trust anyone based on online contact alone, and to verify the other person's identity and intentions before sharing anything sensitive.
In addition, if you receive a message or call threatening to release "hot" photos or videos for blackmail, do not panic and rush to transfer money.
Paying does not guarantee that the criminals will delete the content; they may keep demanding more money or post it online anyway, experts say.
Instead of complying with the criminals' demands, collect evidence (messages, phone numbers, contact accounts, the threatening content...) and immediately report it to the nearest police agency, or report the incident via the Ministry of Public Security's VNeID application for timely support.
How to spot a deepfake?
According to Mr. Nguyen Kim Tho, distinguishing real images and videos from deepfakes is becoming increasingly difficult, but there are still some telltale signs and supporting tools.
Users can start with manual observation, since deepfake content sometimes shows visual and audio abnormalities.
For example, a composited image may show noise or color mismatches between the grafted face and the body. A fake video may have audio-video mismatches (lip movements that don't match the speech) or facial expressions that look stiff and unnatural.
Earlier deepfake videos often featured subjects who never blinked, or lighting and shadows on the face that did not match the surroundings - signs that the video had been edited.
Although the technology keeps improving (newer deepfakes, for example, add realistic blinking), observant viewers can still spot illogical details in videos and photos.
Researchers are now developing many algorithms to automatically detect deepfake traces.
AI-generated content often leaves a distinctive “digital fingerprint” in each pixel that machines can recognize.
For example, Intel has introduced the first real-time deepfake detector, capable of analyzing videos and determining whether the characters in them are real people or AI-generated.
Additionally, some websites allow users to upload videos and photos to check the authenticity of the images (for example, Deepware, Sensity AI, etc.). These systems are constantly updated to keep up with new deepfake techniques.
In particular, users should check the source and context of a post when they encounter a sensitive video or image, because many deepfake videos are spread through fake accounts or bots on social networks.
"If sensitive content about a person comes from an unofficial source or an anonymous account, be skeptical of its authenticity. You can also try contacting the person featured in the content directly to verify whether it is real.
In addition, some industry projects are promoting the embedding of authentication information (digital signatures) into photos and videos at the moment of creation, helping to distinguish original content from edited content. In the future, users will be able to rely on these authentication marks to identify trustworthy content," said Mr. Tho.
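The signing-at-creation idea Mr. Tho describes can be sketched in miniature. The toy Python snippet below is illustrative only: real provenance systems embed public-key digital signatures in the file's metadata, while this sketch substitutes a standard-library HMAC, with a made-up key and made-up byte strings, just to show why any edit to the content breaks verification.

```python
import hashlib
import hmac

# Hypothetical signing key held by the creator's camera or app.
# (Real systems use an asymmetric key pair; this shared secret is a stand-in.)
SECRET_KEY = b"creator-device-key"

def sign(content: bytes) -> str:
    """Produce an authentication tag bound to the exact bytes of the content."""
    return hmac.new(SECRET_KEY, content, hashlib.sha256).hexdigest()

def verify(content: bytes, signature: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(content), signature)

# Made-up byte strings standing in for an original and an edited image file.
original = b"\x89PNG...original image bytes"
tag = sign(original)

print(verify(original, tag))                   # True: content untouched
print(verify(b"\x89PNG...edited bytes", tag))  # False: any edit breaks the check
```

The key property is the same in the real systems: the signature is bound to the exact bytes of the content, so even a tiny edit invalidates it, and edited content can no longer present the original's proof of authenticity.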
5 tips to protect yourself
1. Limit sharing of sensitive, private photos and videos (especially nude photos, family photos, and photos of children).
2. Set your social media accounts to private (share only with people you trust).
3. Do not post too much personal information (full name, phone number, address...) on public social networks.
4. Set strong passwords and enable two-step verification on your accounts to avoid being hacked.
5. Regularly search for your name and image on Google (or use reverse image search tools like Google Images or TinEye) to see whether any of your photos have been posted without permission, and promptly request their removal.
Source: https://tuoitre.vn/dung-ai-ghep-hinh-anh-nhay-cam-de-tong-tien-20250317075948373.htm