On June 12, NBC News reported an FBI warning that scammers, particularly high-tech criminals, are using artificial intelligence (AI) to create fake pornographic videos or photos of victims, then demanding payment to keep the material off the internet.
The method is known as deepfake, a technique that uses deep learning algorithms to create realistic fake videos, images, or audio. The technique has become popular in recent years and continues to spread.
In the warning, the FBI said it has received a growing number of complaints about criminals using deepfake tools to create fake pornographic videos and photos for blackmail.
Criminals harvest photos and videos of victims from social media. After generating the fake content, they spread it across social media, public forums, or pornographic websites.
According to the FBI, victims are coerced into paying with money, gift cards, or even real sexually explicit photos, under the threat that the fake videos and images will otherwise be distributed. Scammers also threaten to send the material to the victim's family and friends if they do not comply.
An NBC News investigation in March found that deepfake pornography was easily accessible through online searches and chat platforms.
The president of the Identity Theft Resource Center (ITRC), a nonprofit that helps fraud victims, says criminals sometimes try to shock victims with especially graphic videos or photos. Once panicked, victims simply want the material to disappear, which pushes them to give in to the scammers' demands.
The FBI advises people to be cautious when accepting friend requests from strangers and to understand that complying with deepfake criminals' demands does not guarantee the sexually explicit videos or images will not be distributed anyway.
The National Center for Missing and Exploited Children offers a free service called Take It Down to help prevent the distribution of sensitive videos or photos of children under 18.