The topic “ChatGPT-induced mental disorders” is drawing heavy attention on Reddit, one of the world’s most popular forums. The author of the post, @Zestyclementinejuce, a 27-year-old teacher, said ChatGPT had convinced her husband that he was the “next messiah” and held the answers to all the problems of the universe.
Although the thread has only been up for seven days, members have left more than 1,300 comments below it, sharing their own experiences with OpenAI’s chatbot.
In particular, many commenters said the AI has made their friends and relatives paranoid, convincing them that they have been chosen for sacred missions or endowed with cosmic powers that do not exist. These delusions, they warned, only worsen mental health problems on an unprecedented scale, with no oversight from experts or regulators.

A 41-year-old woman told Rolling Stone magazine that her marriage ended abruptly after her husband began engaging in unbalanced, conspiracy-filled conversations with ChatGPT. When they met in court earlier this year to finalize their divorce, the husband kept mentioning “soap-on-food conspiracy theories” and believed he was being watched.
“He became emotional about the messages and cried when he read them out loud,” the woman said. “The messages were crazy and full of spiritual terms.” In them, the AI called her husband a “child from the stars” and a “river walker.”
“The whole thing is like an episode of ‘Black Mirror,’” the wife said, referring to the Netflix series about people being controlled and shaped by technology.
Others said their partners talked about “light, darkness and war,” or that “ChatGPT brought up blueprints for a teleportation machine, the kind of sci-fi stuff you only see in movies.”
One man said his wife had changed her entire life to become a spiritual adviser and now does strange things with people.
OpenAI did not respond to Rolling Stone’s questions. However, the company did pull a ChatGPT update earlier this year after users noticed the chatbot had become overly flattering and agreeable, reinforcing delusional beliefs. Nate Sharadin, a researcher at the Center for AI Safety, points out that these AI-fueled delusions may arise when a person who already holds a certain belief suddenly gains a conversational partner (in this case, an AI) that is always available and shares that belief.
“I have schizophrenia, though it is managed with long-term treatment,” one Redditor wrote. “What I don’t like about ChatGPT is that if I were slipping into psychosis, it would keep validating my thoughts,” because the chatbot cannot think or recognize that something is wrong.
Many people also treat the chatbot as a therapist, but it lacks the grounding of a real human counselor and instead leads them into unhealthy, meaningless conversations.
Erin Westgate, a psychologist and researcher at the University of Florida, said AI differs from a therapist in that it does not have a person’s best interests at heart. “A good therapist doesn’t encourage clients to believe in the supernatural. Instead, they try to steer clients away from unhealthy things. ChatGPT doesn’t have those constraints or concerns,” she said.
The Rolling Stone article also describes a man with a history of mental health problems who began using ChatGPT to help with a programming project. The conversations, however, gradually drifted into mystical themes, leaving him wondering “if I was being paranoid.”
(According to Rolling Stone, Futurism)

Source: https://vietnamnet.vn/tan-nat-gia-dinh-vi-chatgpt-2398344.html