Nguyen Thu Huong, 21, a university student in Hanoi, has a habit of searching for everything online, from exercises and beauty tips to diets. Recently, Huong often had pain in the epigastric region, felt nauseous at night, and sometimes vomited. Instead of going to the hospital, the student opened the ChatGPT application to “ask the AI doctor.”
The AI analyzed her description of the symptoms and concluded that she “may have mild gastritis due to stress.” It advised her to change her diet, go to bed early, limit coffee, and take some of the over-the-counter antacids it listed. Huong found those drugs online and took them regularly for two weeks.
The initial pain subsided, and Huong grew more confident that “AI is as good as a doctor.” In the third week, she began vomiting blood mixed with fluid, had cramps in her upper abdomen, and broke out in a cold sweat. Her family took her to the emergency room in a state of severe gastrointestinal blood loss.
Gastroscopy showed a progressing duodenal ulcer: deep, bleeding lesions caused by the incorrect use of antacids and an untreated underlying cause, Helicobacter pylori infection. Had she arrived at the hospital a few hours later, she might have gone into hemorrhagic shock.
After nearly a week of intensive treatment, Huong was finally out of danger. Lying in her hospital bed, she shared: “I thought AI was smart and spoke as convincingly as a real doctor, so I believed it. Now I understand that AI cannot replace an actual diagnosis.”

Le Hoang Long, 27, a programmer in Ho Chi Minh City, often worked until 2-3 a.m. After months of insomnia, he felt stressed and tired and had a rapid heartbeat. Not wanting to see a psychiatrist because he was “afraid of being labeled as sick,” Long confided in ChatGPT and asked for suggestions on how to overcome the problem.
The AI advised him to try “natural methods” such as drinking herbal tea, taking melatonin supplements, meditating before bed, and, “if necessary, combining some Chinese sedatives.” Long followed these instructions and also ordered an “imported herbal sleeping pill” whose origin the AI could not verify.
After two weeks, Long was sleeping better but began to develop jaundice, loss of appetite, and fatigue. Thinking it was a sign of “body cleansing,” he kept taking the product. Only when his urine turned dark and his skin became a deep yellow did he go to the hospital for a checkup.
Test results showed his liver enzymes at ten times the normal level, and doctors diagnosed toxic hepatitis caused by unidentified medicinal ingredients. Long had to be hospitalized for IV fluids, antidotes, and continuous liver-function monitoring.
After the incident, Long admitted: “AI helps me learn, but it doesn’t know who I am or how serious my illness is. I sacrificed my health to blind faith in technology.”
According to MSc. Dr. Pham Ngoc Ha of the Department of Obstetrics and Gynecology, Thanh Nhan Hospital, what is worrying is not artificial intelligence (AI) itself but the way people perceive and use it. AI is essentially a powerful tool that synthesizes, organizes, and interprets information from many sources, helping us understand problems faster and more deeply.
However, AI has no eyes to observe, no hands to touch the wound, and no clinical experience to recognize abnormalities in the patient's gaze or voice.
In medicine, diagnosis is a multi-layered journey. The doctor begins by listening to the patient’s symptoms, observing the manifestations, examining by looking, touching, percussing, and listening, comparing the results of tests, imaging, and medical history. Every treatment decision is a combination of scientific data and experience accumulated over time. No algorithm can fully simulate that.
To put it simply, AI is like a sharp knife. In the hands of a chef, it is a useful tool for creating a delicate dish; in the hands of a child, injury is almost inevitable. The problem is not the knife but the user.
Therefore, AI itself is not wrong; the mistake is that people expect too much of its capabilities. Artificial intelligence is not a doctor and cannot replace human observation, diagnosis, and judgment. From a medical perspective, AI should be seen only as a giant dictionary that users can consult and refer to, not something to be trusted absolutely when human health and life are at stake.

Dr. Truong Huu Khanh, former Head of the Department of Infectious Diseases and Neurology at Children's Hospital 1 (HCMC), commented that in medicine, AI cannot replace clinical practice or the human element. AI can support many fields, but in medicine, the experience, sensitivity, and heart of a doctor are things machines cannot copy.
Each patient needs to be examined directly and comprehensively; a diagnosis cannot rest on a few symptoms described online. The doctor must observe, palpate, percuss, and listen, ask carefully about the patient's medical history, and evaluate the overall condition before reaching an accurate conclusion.
Dr. Khanh believes that AI or "internet doctors" can be considered a source of information, helping people understand more about health and disease prevention. However, they cannot replace real doctors.
AI provides only general information and cannot assess an individual's internal organ damage, medical history, or drug reactions. Self-diagnosis and self-treatment based on “chatbot suggestions” carry many risks, especially for chronic diseases.
ChatGPT is opening up a new approach to health education, but it should be seen as an information assistant, not a decision maker, because medicine still requires human insight and clinical experience.
Source: https://baolangson.vn/tin-chatgpt-chua-benh-nhieu-nguoi-tre-dang-danh-cuoc-suc-khoe-5063604.html





