Abstract
Music perception engages multiple brain regions, yet the neural dynamics of this core human experience remain elusive. We applied predictive models to intracranial EEG data from 29 patients listening to a Pink Floyd song. We investigated the relationship between the song spectrogram and the elicited high-frequency activity (70–150 Hz), a marker of local neural activity. Encoding models characterized the spectrotemporal receptive fields (STRFs) of each electrode, and decoding models estimated the population-level song representation. Both methods confirmed a crucial role of the right superior temporal gyrus (STG) in music perception. A component analysis of the STRF coefficients highlighted overlapping neural populations tuned to specific musical elements (vocals, lead guitar, rhythm). An ablation analysis of the decoding models revealed unique musical information concentrated in the right STG and more spatially distributed in the left hemisphere. Lastly, we provided the first song reconstruction decoded from human neural activity.
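For readers unfamiliar with STRF-style encoding models, the sketch below illustrates the general approach under stated assumptions: a time-lagged copy of the song spectrogram is regressed onto one electrode's high-frequency activity with ridge regression, and the fitted coefficients, reshaped over lags and frequency bins, form that electrode's spectrotemporal receptive field. All variable names, data shapes, and the 500 ms lag window are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of a linear (STRF-style) encoding model, assuming:
# - `spectrogram`: (n_timepoints, n_freq_bins) song spectrogram at 100 Hz
# - `hfa`: (n_timepoints,) one electrode's high-frequency activity (70-150 Hz)
# These inputs are synthetic placeholders, not the paper's data or code.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

def lagged_design_matrix(spectrogram: np.ndarray, n_lags: int) -> np.ndarray:
    """Stack time-lagged copies of the spectrogram so each row holds the
    preceding `n_lags` frames of all frequency bins (the STRF window)."""
    n_t, n_f = spectrogram.shape
    X = np.zeros((n_t, n_lags * n_f))
    for lag in range(n_lags):
        X[lag:, lag * n_f:(lag + 1) * n_f] = spectrogram[:n_t - lag]
    return X

# Hypothetical shapes: ~60 s of song at 100 Hz, 16 frequency bins,
# 50 lags (a 500 ms stimulus window preceding each neural sample).
rng = np.random.default_rng(0)
spectrogram = rng.random((6000, 16))
hfa = rng.random(6000)

X = lagged_design_matrix(spectrogram, n_lags=50)

# Cross-validated prediction accuracy of the encoding model; the fitted
# coefficients, reshaped to (n_lags, n_freq_bins), form the electrode's STRF.
scores = cross_val_score(Ridge(alpha=1.0), X, hfa, cv=5, scoring="r2")
strf = Ridge(alpha=1.0).fit(X, hfa).coef_.reshape(50, 16)
print(f"mean cross-validated R^2: {scores.mean():.3f}")
```

Decoding models reverse this mapping, predicting the spectrogram from the activity of many electrodes at once; the ablation analysis described above can then be approximated by refitting such a model with selected electrodes or regions left out.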