Voices of famous people are used to commentate on games. Photo: Meta.
Recently, a series of videos have appeared on social networks, mainly TikTok, showing screen recordings of gameplay with commentary in the voices of famous people. The clips feature artists such as Huong Giang, Bich Phuong and Phuong My Chi, as well as TikTokers such as Huy Forum, Tho Nguyen and Jenny Huynh, seemingly analyzing and commenting on Lien Quan Mobile matches, sports and more.
In reality, none of these celebrities voiced the videos; users employed AI to mimic their voices. Following instructions shared by one account, creators used a tool called Minimax to convert text to speech.
Minimax is a Chinese AI company known for its ability to clone a voice from just a short recording. Minimax, along with platforms such as OpenAI TTS and ElevenLabs, has lowered the technical barrier, allowing virtually anyone to clone anyone else's voice.
Minimax is backed by large corporations such as Tencent and Alibaba, as well as many venture capital funds. Its customers span industries including gaming, education, finance and entertainment, with common use cases such as dubbing video games, commentating on sports highlights and voicing comedy videos.
Short clips using celebrity voices can spread quickly and easily become a trend. Viewers find it fresh and entertaining to hear their favorite celebrities comment on games or say trending phrases.
Most viewers recognized these as AI voices and jokingly left comments praising the celebrity's "performance". Some of the celebrities themselves, including Huong Giang and Huy Forum, expressed delight at the humorous use of their voices and reposted the content.
Huong Giang and Huy Forum both reposted AI videos using their voices. Photo: TikTok.
With just a few existing recordings of a celebrity, users only need to enter the text they want spoken and wait a few minutes for a finished clip. As a result, many new accounts have sprung up, often named by combining a celebrity's name with a game title, to post such videos. "There is almost no celebrity left who does not play Lien Quan," one commenter joked.
While the trend brings entertainment to the online space, some users have exploited the technology for malicious purposes. Deepvoice scams have grown increasingly sophisticated, using the cloned voices of loved ones in fabricated emergencies to pressure victims into instant money transfers.
In Vietnam, a scam that has recently become common involves calling parents, telling them their child has had an accident and that money must be transferred immediately for surgery.
Abroad, a company director in the UK was scammed out of more than $240,000 after hearing his "boss" request a money transfer over the phone. A finance employee in Hong Kong was tricked into transferring $3.3 million after receiving a call from his "director".
In China, Minimax and other voice AI companies typically stipulate "no cloning of real human voices without permission", but in practice this is very difficult to enforce. The technology resembles deepfakes, which use celebrity faces; here only the voice is faked, for purposes such as illegal advertising, fabricated controversial statements and fake news.
Many international artists, including Drake and The Weeknd, have spoken out after "AI cover" songs using their voices went viral. Privacy invasion and AI-driven fake news are also stirring controversy in many countries, prompting efforts to establish a suitable legal framework.
Beyond dubbing entertainment videos as described above, voice cloning can also serve positive purposes, such as recreating the voices of deceased relatives or helping people who have lost the ability to speak. Even so, anyone whose voice is used should be informed and give consent, to keep the technology in check and reduce the high risk of abuse.
Source: https://znews.vn/su-that-ve-clip-phuong-my-chi-huong-giang-binh-luan-game-post1579300.html