The virtual assistant Lumi greets visitors at the museum lobby. Photo: HXH
Four minutes with Lumi
“Hello! I’m Lumi, the virtual assistant at the Da Nang Museum.” A device that resembles a half-robot, half-electronic display board introduces itself as it waits for visitors at the entrance to the Da Nang Museum.
I was quite curious about Lumi (I'll call her "her," going by the name and voice) and decided to play tourist and ask this virtual assistant for help. It was a morning in early April, when the Da Nang Museum officially opened after four years of construction at the former Governor's Palace on the Han River.
“Please follow me, and try not to block my view. I’ll lead you to the next point: the ticket counter,” Lumi said before her screen rotated and the device rolled away. Stopping at the ticket counter, she suggested I buy tickets at the automated kiosk or directly on my phone, not forgetting to mention a special offer: the museum was currently offering free admission…
“Would you like to learn more about this area? If you have any questions or would like to stop and take a closer look, please press the button below.” The journey continued with questions like that, and I followed Lumi to the reception area, where friendly female staff members were waiting behind the counter.
“If you need any information about the tour itinerary, entrance tickets, or assistance services, please go to the nearest information counter. Our staff will be happy to assist you.”… Suggested options also appeared on the screen for visitors to interact with via buttons: “continue the journey,” “I would like to ask a question,” and “pause here.”
I waited out the 30-second countdown on the screen before returning to the starting point with Lumi. “Time to visit this area is over,” Lumi said before bidding me goodbye to pick up the next group of visitors. We parted ways after about four minutes together. In total, Lumi and I stopped only three times, ending right before the elevators leading to the second and third floors, where the exhibition rooms are located.
Lumi appears as a modern-day "technological greeting" in a place devoted to the space and time of the past: the museum. That contrast is what made such a strong impression on me. If Lumi had a more human-like appearance, more people would recognize her sooner and stop to interact with her.
"Knowing why people cry"
Saying goodbye to Lumi reminded me of another farewell near the end of the famous film "Terminator 2: Judgment Day".
A female tourist interacts with the Lumi virtual assistant. Photo: HXH
Many people surely haven't forgotten the scene of the life-or-death battle between two robots – two destroyers sent from the future: T-800 and T-1000. The extremely advanced killing machine T-1000 was sent back in time by the evil artificial intelligence Skynet to assassinate John Connor, the future leader of the human resistance (at that time, John was just a teenager).
T-800, on the other hand, was sent by the resistance to protect John and ensure the future of humanity. When the villainous T-1000 fell into a vat of molten steel and melted, T-800's severed arm and chip were thrown in as well. Sarah, John's mother, said, "That's it." But T-800 pointed to its head: "No. There's one more chip, and it must be destroyed too. I can't destroy myself. You have to lower me into the molten steel."
It was a decision to self-destruct, made by the T-800 itself. But young John pleaded:
- No, no. It'll be all right. Please stay with us!
- I have to go. It has to end here.
- I order you not to go…
At this point, John broke down, tears streaming down his face. T-800 slowly pointed to the tears and spoke as if questioning himself: "I know now why you cry. But it's something I can never do."
This second installment of the action–science fiction franchise was released in 1991, followed by sequels in 2003, 2009, 2015, and 2019. The Terminator's tears in Part 2 stirred human imagination, and researchers have been trying ever since to "plant" emotions in robots.
Nearly 30 years later, the phrase "robots that can sense human pain" became widely discussed in the media. This was when Japanese engineer Minoru Asada and his colleagues designed reliable touch sensors for the Affetto robotic system.
Touch and pain signals, picked up by sensors embedded in the soft artificial skin, can be translated into emotional expressions on the face of this robot, which is modeled after a child's head. This artificial pain nervous system could also allow the robot to "empathize" with similar sensations experienced by humans.
In late 2024, scientists were also discussing the phenomenon of "skin conductivity," predicting that in the not-too-distant future, robots might be able to detect human emotions simply by touching the skin.
Earlier this year, at the CES 2025 consumer electronics show in the US, Canadian robotics company Realbotix introduced Melody, billed as the most human-like robot yet: integrated with artificial intelligence (AI) and capable of expressing emotions and communicating with humans through voice and eye contact.
*
* *
Even the most human-like robot still leaves a gap between machine and human, though that gap is gradually closing. While cinema has imagined a robot moved by human tears, as "Terminator 2: Judgment Day" did, scientists are now creating human-like emotions in real robots. Will this strange yet familiar emotion reawaken the hearts of those who have grown too emotionally numb?
Source: https://baoquangnam.vn/cam-xuc-robot-3153197.html