Updated date: 31/03/2024 05:42:42
According to an OpenAI blog post sharing the results of a small-scale test of the "Voice Engine," the tool can essentially clone someone's voice based on a 15-second audio sample.
Photo: OpenAI
OpenAI has just unveiled a voice-cloning tool, but it will be kept under tight control until safeguards are put in place to prevent audio fakes from misleading listeners.
OpenAI acknowledges that voice generation poses serious risks, especially in an election year. However, the company says it is working with domestic and international partners from government, media, entertainment, education, civil society and other sectors to gather feedback and build a safe tool.
OpenAI says it has implemented a suite of safety measures, including marking the origin of any audio generated by Voice Engine and actively monitoring how the tool is used.
Disinformation researchers are concerned about the risks of massive abuse of AI-powered apps in a crucial election year, as voice-cloning tools are increasingly cheap, easy to use and hard to trace.
One example is an AI-generated call, the brainchild of a lobbyist for Minnesota congressman Dean Phillips, that sounded like President Joe Biden urging people not to vote in the New Hampshire primary in January 2024.
The incident has raised concerns among experts about a wave of AI-generated deepfake (fake audio or video) misinformation in the 2024 White House race and other major global elections this year.
According to VNA/NDO