Affiliation:
1. Key Laboratory of Noise and Vibration Research, Institute of Acoustics, Chinese Academy of Sciences, Beijing, China
2. University of Chinese Academy of Sciences, Beijing, China
3. Shanghai Institute of AI for Education, East China Normal University, Shanghai, China
4. Department of Electrical Engineering and Computer Science, University of Wisconsin–Milwaukee, Milwaukee, Wisconsin, USA
Abstract
The head-related transfer function (HRTF) plays a vital role in immersive virtual and augmented reality technologies, especially in spatial audio synthesis for binaural reproduction. This article proposes a deep learning method that takes generic HRTF amplitudes and anthropometric parameters as input features for individual HRTF generation. Fully convolutional neural networks use the key anthropometric parameters and the generic HRTF amplitudes to predict each individual's HRTF amplitude spectrum over all spatial directions, while a transformer module predicts the interaural time delay (ITD). In the amplitude prediction model, an attention mechanism is adopted to better capture the relationship between HRTF amplitude spectra at two directions separated by large angles in space. Finally, with the minimum-phase model, the predicted amplitude spectra and ITDs are combined to obtain a set of individual head-related impulse responses. Besides training the HRTF amplitude and ITD generation models separately, their joint training was also considered and evaluated. The root-mean-square error and the log-spectral distortion were selected as objective metrics to evaluate performance. Subjective experiments further showed that the auditory source localisation performance of the proposed method was better than that of other methods in most cases.
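The final reconstruction step described in the abstract — combining a predicted amplitude spectrum with a predicted ITD via the minimum-phase model — is a standard DSP operation. The sketch below shows one common way to do it (via the real cepstrum), assuming a one-sided magnitude spectrum and an integer-sample ITD; the function name and interface are hypothetical, not the authors' code.

```python
import numpy as np

def min_phase_hrir(mag, itd_samples=0):
    """Build a minimum-phase HRIR from a one-sided magnitude spectrum,
    then prepend a pure delay for the ITD.

    mag: one-sided magnitude spectrum of length N//2 + 1.
    itd_samples: integer delay (in samples) modelling the ITD for this ear.
    """
    # Mirror to a Hermitian-symmetric full-length magnitude spectrum.
    full = np.concatenate([mag, mag[-2:0:-1]])
    n = len(full)
    log_mag = np.log(np.maximum(full, 1e-12))  # avoid log(0)

    # Minimum-phase spectrum via the real cepstrum: window the cepstrum
    # so that the result is causal and minimum phase.
    cep = np.fft.ifft(log_mag).real
    w = np.zeros(n)
    w[0] = 1.0
    w[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        w[n // 2] = 1.0
    min_phase_spec = np.exp(np.fft.fft(w * cep))

    hrir = np.fft.ifft(min_phase_spec).real
    # Insert the ITD as a simple leading delay.
    return np.concatenate([np.zeros(itd_samples), hrir])
```

Applying this per ear with the two predicted ITD-derived delays yields the binaural HRIR pair; fractional-sample ITDs would require interpolation instead of the integer delay used here.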
Publisher
Institution of Engineering and Technology (IET)
Subject
Artificial Intelligence,Computer Networks and Communications,Computer Vision and Pattern Recognition,Human-Computer Interaction,Information Systems
Cited by
6 articles.