Authors:
Cui Yibo, Qiao Kai, Zhang Chi, Wang Linyuan, Yan Bin, Tong Li
Abstract
Computational visual encoding models play a key role in understanding the stimulus–response characteristics of neuronal populations in the visual cortex. Building such models, however, typically faces the challenge of constructing effective non-linear feature spaces that fit the neuronal responses. In this work, we propose the GaborNet visual encoding (GaborNet-VE) model, a novel end-to-end encoding model for the visual ventral stream. The model comprises a Gabor convolutional layer, two regular convolutional layers, and a fully connected layer. Its key design principle is to replace the regular kernels in the first convolutional layer with Gabor kernels whose parameters are learnable. A single GaborNet-VE model efficiently and simultaneously encodes all voxels in one region of interest (ROI) of functional magnetic resonance imaging (fMRI) data. The experimental results show that the proposed model achieves state-of-the-art prediction performance for the primary visual cortex. Moreover, visualizations reveal regularities in how each ROI fits the visual features, as well as the estimated receptive fields. These results suggest that the lightweight, ROI-based GaborNet-VE model, which combines handcrafted and deep-learning features, offers good expressiveness and biological interpretability.
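The architecture described in the abstract lends itself to a compact sketch. The PyTorch code below is a minimal illustration under stated assumptions, not the authors' implementation: a GaborConv2d layer builds its kernels from learnable Gabor parameters (orientation, wavelength, phase, envelope width, aspect ratio), and a GaborNetVE module stacks it with two regular convolutional layers and a fully connected readout over all voxels of one ROI. All layer widths, kernel sizes, initial parameter ranges, and the voxel count in the usage example are illustrative assumptions.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class GaborConv2d(nn.Module):
    """Convolution whose kernels are Gabor functions with learnable
    orientation, wavelength, phase, envelope width, and aspect ratio."""

    def __init__(self, in_channels, out_channels, kernel_size, stride=1, padding=0):
        super().__init__()
        self.stride, self.padding = stride, padding
        shape = (out_channels, in_channels)
        # Initial parameter ranges are illustrative assumptions.
        self.theta = nn.Parameter(torch.rand(shape) * math.pi)        # orientation
        self.sigma = nn.Parameter(torch.rand(shape) * 2.0 + 2.0)      # envelope width
        self.lambd = nn.Parameter(torch.rand(shape) * 4.0 + 4.0)      # wavelength
        self.psi = nn.Parameter(torch.rand(shape) * 2.0 * math.pi)    # phase offset
        self.gamma = nn.Parameter(torch.ones(shape))                  # aspect ratio
        # Fixed spatial grid on which the Gabor function is evaluated.
        coords = torch.arange(kernel_size, dtype=torch.float32) - (kernel_size - 1) / 2
        gy, gx = torch.meshgrid(coords, coords, indexing="ij")
        self.register_buffer("gx", gx)
        self.register_buffer("gy", gy)

    def forward(self, x):
        # Rebuild the kernel bank from the current parameter values on every
        # pass, so gradients flow into the Gabor parameters themselves.
        theta, sigma, lam, psi, gamma = (
            v.unsqueeze(-1).unsqueeze(-1)
            for v in (self.theta, self.sigma, self.lambd, self.psi, self.gamma)
        )
        gx = self.gx.view(1, 1, *self.gx.shape)
        gy = self.gy.view(1, 1, *self.gy.shape)
        xr = gx * torch.cos(theta) + gy * torch.sin(theta)
        yr = -gx * torch.sin(theta) + gy * torch.cos(theta)
        kernels = torch.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2)) \
            * torch.cos(2 * math.pi * xr / lam + psi)
        return F.conv2d(x, kernels, stride=self.stride, padding=self.padding)


class GaborNetVE(nn.Module):
    """Gabor conv layer -> two regular conv layers -> fully connected readout
    mapping features to the responses of all voxels in one ROI."""

    def __init__(self, n_voxels, in_channels=1):
        super().__init__()
        self.features = nn.Sequential(
            GaborConv2d(in_channels, 32, kernel_size=9, stride=2, padding=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        self.readout = nn.LazyLinear(n_voxels)  # one output per voxel in the ROI

    def forward(self, x):
        return self.readout(torch.flatten(self.features(x), start_dim=1))


# Usage example with a hypothetical voxel count and stimulus size: the model
# would be trained by regressing these outputs against measured fMRI responses.
model = GaborNetVE(n_voxels=500)
stimuli = torch.randn(8, 1, 128, 128)
predicted = model(stimuli)  # shape: (8, 500)

Because the first-layer kernels are regenerated from the Gabor parameters at each forward pass, gradient descent adjusts a small set of interpretable parameters rather than arbitrary kernel weights, which is the design choice the abstract attributes to the Gabor convolutional layer.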
Funder
National Key Research and Development Program of China
Cited by: 6 articles.