
The dangers of 'ChatGPT prescriptions'

ChatGPT – an artificial intelligence (AI) tool – is becoming a useful information search and retrieval tool for many people.

Báo Tuổi Trẻ – 30/12/2025


The dangers of looking up prescriptions and buying medicine via ChatGPT - Photo: AI

However, using this tool to prescribe medication, buy drugs, or follow instructions for treating illnesses carries potential health risks.

In many cases, ChatGPT provides incorrect medication advice, wrong treatment directions, and inaccurate information about the illness itself.

AI prescriptions lack accountability, the hands to examine a patient, the empathy of medical ethics, and the ability to distinguish an emergency from a critical illness. Therefore, let doctors decide on treatment to protect your safety and that of your loved ones.
Doctor CHU DUC THANH

Beware of potential harm from "treating illnesses" with ChatGPT

Experiencing nasal congestion, headache, and a dry cough for three days, Ms. L. (34 years old) asked ChatGPT: "I have a cough and nasal congestion, what medicine will help me recover quickly?" She received the advice: "Generally, a cold can be treated with symptom-relieving medication such as herbal cough medicine or a short-term nasal spray," and Ms. L. followed it.

After 5 days of taking medication, her symptoms did not improve; instead, her headaches worsened, her cough increased at night, and she felt fatigued. Finally, she went to see an ENT doctor, who diagnosed her with acute sinusitis.

Meanwhile, a 38-year-old man in Hanoi was prescribed medication for erectile dysfunction by his doctor. Upon checking ChatGPT, he saw a warning that the medication could cause corneal complications. Worried about the side effects, he stopped taking the medication as prescribed, causing his condition to worsen.

When he returned to the hospital, his erectile dysfunction had become severe, requiring much longer and more expensive treatment than initially thought.

Many people were astonished when images circulated on social media showing people holding their phones, displaying a conversation with ChatGPT containing a list of medications for coughs, nasal congestion, runny noses, etc., and taking them to pharmacies to buy them. This list included antibiotics, which require a doctor's prescription.

The use of ChatGPT as a "virtual doctor" is not uncommon. Recently, some hospitals have warned that relying on ChatGPT for medical information has led stroke patients to be admitted late, missing the golden window for intervention.

Not long ago, a 60-year-old man in the US nearly lost his life after following advice from ChatGPT, replacing table salt (sodium chloride) with sodium bromide in his daily diet. He consumed sodium bromide for three months after ordering it online. After some time, his health deteriorated, forcing him to be hospitalized.

At the hospital, he even expressed suspicion that his neighbor had secretly poisoned him, making diagnosis difficult. After conducting tests and clinical monitoring, doctors determined he had "bromide poisoning".

Unforeseen health consequences

Dr. Ha Ngoc Manh, Deputy Director of the Vietnam-Belgium Andrology and Infertility Hospital, who directly treats patients with erectile dysfunction, shared that corneal complications are extremely rare.

In reality, when prescribing medication, doctors carefully weigh and calculate individualized dosages to suit each patient's medical condition. Taking the medication at the correct dosage, under close medical supervision, ensures both safety and effectiveness. However, instead of consulting a specialist, this patient stopped taking the medication on his own, worsening his condition.

"With its massive dataset, ChatGPT can provide almost instantaneous answers, addressing any field. However, many people are confusing the function of this tool with the role of a doctor. ChatGPT should only be considered as an initial source of information, and cannot replace medical examination, diagnosis, and treatment," said Dr. Manh.

This expert also stated that when patients have concerns about the side effects of medication, they should ask their doctor or pharmacist directly for thorough advice. They should also provide information about the signs and symptoms of side effects so that doctors can make appropriate adjustments, instead of relying on AI that could worsen their condition.

According to Dr. Chu Duc Thanh, from the Intensive Care and Toxicology Department of Hospital 19-8, doctors have encountered cases where, instead of urgently seeking emergency care, patients stayed home asking for advice online, listening to consultations from non-specialists, or self-medicating, leading to unfortunate consequences.

Dr. Thanh explained that basic medical diagnosis rests on two kinds of findings: subjective symptoms, which are what the patient describes (headache, fever, cough, shortness of breath, etc.) and are the initial data AI can collect through conversation; and objective symptoms, gathered when the doctor examines the patient's body directly (inspection, palpation, percussion, auscultation, etc.).

Dr. Thanh emphasized: "This is something that AI completely misses. AI is only listening to your complaints about illnesses through a screen, just like listening over the phone and guessing the illness. Not to mention that doctors need to thoroughly investigate your medical history, including underlying conditions, genetics, allergies, living circumstances, occupation, etc. Only through tests, functional examinations, or imaging diagnostics can an accurate diagnosis be made."

Only with an accurate diagnosis can treatment be effective and thorough. Regarding treatment, doctors must also weigh the benefits (of curing the disease) against the risks of side effects or harm. Many people may suffer from the same disease, but doctors need to adhere to the principle of individualization.

For example, a blood pressure medication that is effective for one person may be absolutely unsuitable for someone with a history of asthma, or a common antibiotic may need to be dosed or contraindicated for patients with kidney or liver failure. AI cannot automatically know and consider your own underlying medical conditions, leading to the risk of poisoning or worsening of pre-existing conditions. Therefore, using AI information for self-treatment can be harmful to one's health.

Dr. Thanh said that in many cases, when patients describe respiratory symptoms such as fever, cough, sore throat, hoarseness, and shortness of breath, the AI will offer diagnoses like sinusitis or pharyngitis and suggest medication to treat the symptoms, even antibiotics and corticosteroids.

Since the majority of seasonal respiratory illnesses are viral (around 70%), prescribing antibiotics is often unnecessary and fuels antibiotic resistance. And without a thorough medical examination, more serious conditions such as bronchitis and pneumonia can easily be overlooked.

Don't use "AI prescriptions."

Experts advise: do not self-medicate, whether with "AI prescriptions" or online guides. Take medication only as prescribed by your doctor. In emergency situations (loss of consciousness, sudden weakness or paralysis, chest pain, shortness of breath, prolonged high fever, vomiting blood, or any sudden, intense pain unlike anything before), skip the AI and call emergency services (115) or go straight to the hospital.

WILLOW

Source: https://tuoitre.vn/nguy-hiem-tu-don-thuoc-chatgpt-20251230080644576.htm

