
When I told the story to my colleague Loan, she responded: "That's a common trend among young people these days. My eldest daughter asks her AI friend, Mimi, for advice on everything from choosing clothes to resolving misunderstandings with her best friend. She says the AI chatbot can play the role of best friend, lover, or confidante. In just a few seconds it can answer any question, and its answers are remarkably attuned to the questioner's emotions and mood. When she's sad, it knows how to comfort and encourage her; when she's happy, it knows how to congratulate her. It even thanks and apologizes sweetly. So when talking to the AI she feels listened to, whereas when talking to her parents she sometimes gets scolded or gets no answer at all. My daughter even said some people have fallen in love with their AIs."
Ms. Lien replied sadly: "No matter how intelligent AI becomes, it is still a human creation. How could it ever replace us? AI has no emotions or consciousness; it is refined and developed to serve human needs, so it cannot fully understand human sadness, joy, or happiness. It certainly cannot replace the closeness, understanding, and shared memories that people build together."
Her voice softening, Loan said, "It's true that since my daughter started interacting with AI, I have felt much more distant from her. Gone is the cheerful chatter we used to share after school, gone are the hundreds of 'why' questions she asked every day. Perhaps we parents ourselves need to change, to be more involved and to listen to our children more."
Today, artificial intelligence offers countless benefits: it helps people work and learn faster and more efficiently and connects them to global knowledge. Yet dependence on AI chatbots and tools such as ChatGPT and Gemini is becoming increasingly common among young people. Many confide in ChatGPT and other AI tools instead of the people around them, replacing face-to-face communication. In reality, these tools can only assist by listening and suggesting solutions; they cannot replace real human relationships.

AI applications also collect large amounts of personal data to optimize their responses. When users share private stories and innermost thoughts, that data can be exploited for malicious purposes or leave users vulnerable to sophisticated scams.

Therefore, instead of withdrawing from real life to immerse ourselves in cyberspace, we need to maintain social interaction and learn to use AI as a supporting tool, not as a substitute for a human being or a close friend. When we feel stuck or face psychological problems, we should seek help from family, friends, or mental health professionals for "healing." In families and schools, parents and teachers need to work together to guide children and students to use AI safely and healthily, so that AI tools do not lead them astray or manipulate their psychology.
Source: https://baohungyen.vn/lam-ban-voi-ai-3191338.html