Loneliness has never been a more serious problem. Beyond the damage to mental health, it has become a genuine security threat.

Taking advantage of the loneliness of their prey, cybercriminals are deploying one of the most dangerous scams today: romance scams.

As the operation becomes more professionalized and fueled by modern technology, romance scams can be carried out on a massive scale.

Attackers build relationships and trust with targets through dating apps or social media. AI chatbots are used to script scenarios and conversations in many different languages.

With the single population constantly growing, researchers warn that automation will only extend scammers' reach.

Romance scam victims lose hundreds of millions of dollars in the US alone. Photo: FBI

These scams are becoming more organized, according to Fangzhou Wang, an assistant professor of cybercrime research at the University of Texas.

They recruit from all over the world, targeting all kinds of victims. Dating apps and social networks have become fertile ground for scammers.

In the US, victims of romance scams reported losses of nearly $4.5 billion over the past 10 years, according to an analysis of the FBI's annual cybercrime reports.

In the five years to the end of 2023, romance fraud caused losses of roughly $600 million each year, peaking at nearly $1 billion in 2021.

Romance scams take place almost entirely online, with criminals sending Facebook messages to hundreds of potential victims at once, or matching with every profile they find on dating apps.

While criminals operate everywhere, from Yahoo Boys in West Africa to scam farms in Southeast Asia, they all follow a playbook for creating emotional attachments with their victims.

Elisabeth Carter, Associate Professor of Criminology at Kingston University London, calls romance scams the “most devastating” scam a person can encounter.

Online dating has become a daily occurrence in modern society. Wang says she has seen evidence of scammers using generative AI to write content for their online profiles.

Some gangs in Southeast Asia have developed AI tools for their scams. In October 2024, a report published by the United Nations found that organized criminals were “writing personalized scripts to trick victims while communicating in real time in hundreds of languages.”

According to Google, phishing emails sent to businesses are being written using AI. The FBI also notes that AI allows cybercriminals to message victims more quickly.

Cybercriminals use a range of manipulative tactics to lure victims into a trap and build romantic relationships. These include asking intimate questions that only a soulmate would ask, such as about dating history or past relationships.

They also create intimacy through the “love bombing” technique, expressing passionate feelings to speed up the process. As the relationship progresses, they will often call the victim their boyfriend, girlfriend, husband, or wife.

Associate Professor Carter emphasized that a core tactic of romance scammers is to play the role of a vulnerable, unfortunate person. Sometimes they even claim to have been scammed themselves and to be wary of trusting others, creating the impression that they could not possibly be scammers.

This groundwork pays off when the scam reaches the money stage. The scammer will mention that their business is having money problems, then disappear and return a few weeks later.

The victim may then want to help and volunteer to send money. The scammer will initially refuse and insist the victim should not send anything, all as psychological manipulation.

The language of a romance scammer is quite similar to that of a domestic abuser, according to Carter.

In many cases, the perpetrators have successfully preyed on people struggling with loneliness, according to Brian Mason, a police officer in Alberta, Canada.

When working with fraud victims, Mason says, it is difficult to convince them that the person they have been talking to does not love them.

In one case, the victim even contacted the scammer again, transferring money just to see his photo because she was lonely. In late 2023, the World Health Organization declared high levels of loneliness a threat to people's health.

Stigma and shame are key factors that make it difficult for victims to accept the reality they are experiencing. Carter notes that attackers exploit this psychology by telling victims that they should not reveal the conversation to others because the relationship is too special and no one will understand.

Keeping the relationship secret, combined with tactics that prompt victims to offer money rather than being asked for it, makes it difficult for even the most cautious people to realize they are being manipulated.

According to Carter, victims not only lose a great deal of money; they are also betrayed by the person they love and trust the most. “Just because it’s online, just because it’s completely fake, doesn’t mean they don’t have real feelings,” she said.

(Compiled)