
Australia develops AI technology to detect deepfake audio with near-absolute accuracy - Illustration photo: REUTERS
Scientists at the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Federation University Australia and RMIT University have successfully developed a method to detect audio deepfakes with outstanding accuracy and adaptability.
The new technique, called Rehearsal with Auxiliary-Informed Sampling (RAIS), is designed specifically to detect audio deepfakes, a growing cybercrime threat whose risks include bypassing voice-biometric authentication systems, impersonation and the spread of misinformation, CSIRO said.
The RAIS technique not only determines whether an audio track is authentic, but also maintains high detection performance even as spoofing attacks continue to evolve and change.
Dr Kristen Moore, co-author of the study at Data61, CSIRO's data and digital arm, said the team's goal was to build a detection system that can learn from new deepfake samples without retraining the model from scratch, while avoiding catastrophic forgetting, in which a model loses knowledge of old data during fine-tuning.
RAIS solves this problem by automatically selecting and storing a small, diverse set of previous deepfakes, including their hidden audio features, so the AI can learn new deepfake types while preserving knowledge of old ones, Moore explained.
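The "select and store a small set of previous examples" idea is the classic rehearsal (replay) approach to continual learning. The sketch below is a minimal, generic illustration of such a replay memory, not CSIRO's actual RAIS code; the class name and reservoir-sampling policy are assumptions chosen for brevity.

```python
import random

class RehearsalBuffer:
    """Bounded replay memory for continual learning (illustrative sketch,
    not the RAIS implementation): keeps a sample of past training examples
    so future batches can mix old and new data."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.items = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        # Reservoir sampling: keeps a uniform random sample of all
        # examples seen so far within a fixed memory budget.
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(example)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = example

    def replay(self, k):
        # Draw up to k stored examples to mix into the next training batch,
        # which is what prevents the model from forgetting old attack types.
        return self.rng.sample(self.items, min(k, len(self.items)))
```

In training, each new batch would be the fresh deepfake samples plus `buffer.replay(k)` old ones, so gradient updates keep rehearsing earlier attack types.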
Specifically, RAIS relies on an intelligent selection process that generates "auxiliary labels" for each audio sample. Choosing which samples to keep based on these auxiliary labels, rather than on the simple "real" or "fake" label alone, ensures a rich and diverse memory set, and this mechanism significantly improves the system's ability to remember and adapt over time.
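To make the auxiliary-label idea concrete, here is a toy sketch of diversity-aware selection: it greedily keeps samples whose auxiliary label is least represented in the memory so far. The function name and the single-string auxiliary label are hypothetical simplifications; RAIS's actual labels and selection criterion are richer.

```python
from collections import Counter

def auxiliary_informed_select(samples, memory_size):
    """Pick a memory set that covers auxiliary labels as evenly as possible
    (toy illustration of diversity-aware sampling, not the RAIS algorithm).
    `samples` are (audio_id, aux_label) pairs."""
    kept = []
    counts = Counter()
    remaining = list(samples)
    # Greedily prefer the sample whose auxiliary label is currently least
    # represented, so no single attack type dominates the memory.
    while remaining and len(kept) < memory_size:
        best = min(remaining, key=lambda s: counts[s[1]])
        remaining.remove(best)
        kept.append(best)
        counts[best[1]] += 1
    return kept
```

With only "real"/"fake" labels, a buffer can end up filled with near-duplicates of one attack; finer-grained auxiliary labels give the selector a signal for keeping the memory varied.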
According to CSIRO, in testing RAIS outperformed competing methods, achieving an average error rate of 1.95% across five consecutive evaluations. The source code for the technique has been released on GitHub, an online code-hosting platform.
Source: https://tuoitre.vn/uc-phat-trien-cong-cu-vach-tran-giong-noi-gia-bang-deepfake-20251112092232468.htm