Abstract
The modern internet has given rise to various voice-related crimes worldwide, notably deepfake voice scams in which perpetrators use artificial intelligence to deceive victims through voice forgery. This review article discusses the advancements and challenges in voice biometrics, focusing on the impact of AI and deep learning on the field. It traces the evolution of voice biometrics from early methods to modern AI-enhanced techniques, highlighting significant improvements in accuracy, security, and adaptability. The key findings indicate that while AI-driven advances have addressed many challenges, including robustness to voice variation and multilingual recognition, new threats such as deepfake audio demand ongoing innovation. The integration of deep learning, neural networks, and advanced feature extraction has shown considerable potential for enhancing system resilience. However, challenges such as voice variability, privacy concerns, and the forensic application of these technologies remain critical issues for future researchers to address. This review recommends multidisciplinary research to bridge the gap between voice biometrics and forensic science, emphasizing the need for continued development to anticipate and prevent emerging threats.