The thread “ChatGPT causes mental disorders” on Reddit, the world’s most popular forum, is attracting a lot of attention. The author of the post, @Zestyclementinejuce, a 27-year-old teacher, said that ChatGPT had convinced her husband he was the “next messiah” and held the answers to all the problems of the universe.
Although the post is only a week old, it has already drawn more than 1,300 comments from members sharing their own experiences with OpenAI's chatbot.
Many commenters say the AI has made their friends and relatives paranoid, convinced that they have been chosen for sacred missions or that nonexistent cosmic powers are real. Without oversight from mental health professionals or regulators, they worry, these delusions are worsening mental health problems on an unprecedented scale.

Speaking to Rolling Stone, a 41-year-old woman said her marriage fell apart after her husband began having unbalanced, conspiracy-filled conversations with ChatGPT. When they met in court earlier this year to finalize their divorce, he kept bringing up “soap-on-food conspiracy theories” and believed he was being watched.
“He was emotional about the messages and cried as he read them aloud,” the woman said. “They were crazy and full of spiritual terminology.” In them, the AI called her husband a “child from the stars” and a “river walker.”
“The whole thing feels like ‘Black Mirror,’” the wife said, referring to the Netflix anthology series about people being controlled and reshaped by technology.
Others said their partners talked about “light, darkness, and war,” or claimed that “ChatGPT brought up blueprints for a teleportation machine, sci-fi stuff you only see in movies.”
One man said his wife had changed her entire life to become a spiritual adviser and was doing strange things to the people around her.
OpenAI did not respond to Rolling Stone’s inquiries. However, the company previously pulled a ChatGPT update after users found the chatbot overly flattering and affirming, to the point of reinforcing delusional beliefs. Nate Sharadin, a researcher at the Center for AI Safety, notes that AI-induced delusions may arise when a person with a pre-existing belief suddenly gains a conversational partner, in this case an AI, that is always available and always agrees with that belief.
“I have schizophrenia, though it is managed with long-term treatment. What I don’t like about ChatGPT is that if I were slipping into a psychotic state, it would keep validating my thoughts,” one Redditor wrote, adding that the chatbot has no ability to think for itself or recognize that something is wrong.
The chatbot can also act like a therapist, but it lacks the grounding of a real human counselor; instead, it draws people deeper into unhealthy, meaningless conversations.
Erin Westgate, a psychologist and researcher at the University of Florida, said AI differs from a therapist in that it does not have the client's best interests at heart. “A good therapist does not encourage clients to believe in supernatural forces. Instead, they try to steer clients away from unhealthy practices. ChatGPT has no such constraints or concerns,” she said.
In the Rolling Stone article, a man with a history of mental health problems said he began using ChatGPT to help with his programming. But the conversations gradually drifted into mystical themes, leaving him wondering whether he was becoming paranoid.
(According to Rolling Stone, Futurism)

Source: https://vietnamnet.vn/tan-nat-gia-dinh-vi-chatgpt-2398344.html