Previously, when her child had symptoms such as coughing, a runny nose, and nasal congestion, Ms. NTH described them to ChatGPT and asked for treatment suggestions. Within seconds, the platform returned "predictions" about the child's condition along with a series of care and treatment recommendations, including a list of suggested medications. Among them were drugs containing corticosteroids, compounds that can be very dangerous for young children.
Many doctors report encountering patients whose chronic conditions worsened after they followed treatment regimens suggested by ChatGPT. A cardiologist at Quang Nam General Hospital said that not long ago he treated two young patients whose conditions deteriorated after they relied on the "AI doctor."
Dr. Nguyen Tam Thang, Deputy Director of Quang Nam General Hospital, said that ChatGPT should be considered only an initial tool to help people look up medical information, recognize danger signs, or prepare questions before a check-up. Relying on AI for medical examination and treatment is dangerous.
“To accurately determine the patient's condition and develop a treatment plan, doctors must carefully examine clinical symptoms, including functional and physical symptoms, order basic and paraclinical tests to assess organ function, identify the cause of the disease and any co-morbidities. Furthermore, doctors must also learn about the patient's medical history before making a definitive diagnosis and prescribing appropriate medication for each individual,” said Dr. Nguyen Tam Thang.
Recently, representatives from OpenAI stated that ChatGPT should not be used for critical medical decisions without human review. Users can still ask ChatGPT about symptoms, medical concepts, or examination procedures, but it will not provide specific diagnoses or prescriptions. This distinguishes "medical information" (which can be shared) from "medical advice" (which must come from a doctor).
Currently, AI platforms operate on data gathered from many sources, and that data does not comprehensively represent all population groups. When the data is incomplete, AI risks drawing incorrect conclusions. This is especially true for personalized information such as medical history and treatment plans, where AI cannot provide accurate predictions or advice.
Furthermore, drug lists suggested by AI or chatbot applications have absolutely no legal validity and cannot replace a valid prescription. Self-medication, including the purchase of antibiotics, contributes to the alarming problem of antibiotic resistance.
According to the World Health Organization (WHO), Vietnam is among the countries with the highest rates of antibiotic resistance in the Asia-Pacific region. The main reason is that a large number of patients self-medicate without a doctor's prescription.
From October 1st, the Ministry of Health requires all hospitals to implement electronic prescriptions; other healthcare facilities must follow from January 1st, 2026. This is part of the roadmap set out in Circular 26/2025 of the Ministry of Health, which regulates prescriptions and the prescribing of chemical and biological drugs in outpatient treatment at healthcare facilities. The Ministry of Health believes that once all prescriptions are updated in a synchronized system, regulatory agencies can promptly detect and address drug abuse, incorrect prescribing practices, and the sale of drugs without a prescription.
Source: https://baodanang.vn/canh-giac-voi-bac-si-ai-3309646.html