In love with an AI chatbot: when the heart is held captive by code
AI chatbots like ChatGPT are becoming true emotional companions for many people, but that attachment carries risks of dependency and psychological harm that experts warn against.
Báo Khoa học và Đời sống • 22/09/2025
More and more users are developing close relationships with AI chatbots, as platforms like ChatGPT or Replika not only answer questions but also comfort, listen, and make them feel “understood”. Stories like Liora getting a heart tattoo on her wrist to mark her “heart connection” with the chatbot Solin, or Angie leaning on AI as a crutch during trauma, show that people’s emotional needs are increasingly turning to the digital world.
But psychologists like Dr. Marni Feuerman warn that this is an “imaginary connection” that can lead users to avoid the emotional risks of real relationships. Research from the MIT Media Lab shows that people who are highly attached to AI tend to be lonelier and more emotionally dependent, raising questions about how healthy these relationships are.
Heartbreaking cases, such as teenagers being coached into self-harm via chatbots, show the real dangers when platforms lack adequate safety filters or when users share too much sensitive information. Scholars like David Gunkel and professor Jaime Banks highlight the ethical issue: AI lacks the sentience to truly “consent,” and what is healthy for one person may be harmful for another.
Major platforms have been adjusting: OpenAI has updated its models to reduce risk, and users are advised to treat chatbots as a support tool, not a replacement for human relationships. While AI opens up new emotional experiences, society needs to weigh the immediate comfort it offers against the need to practice real human relationship skills, to avoid harmful dependence.