Abstract
In recent years, multivariate pattern analysis (MVPA) has been hugely beneficial for cognitive neuroscience, making new experiment designs possible and increasing the inferential power of functional magnetic resonance imaging (fMRI), electroencephalography (EEG), and other neuroimaging methodologies. In a similar time frame, "deep learning" (a term for the use of artificial neural networks with convolutional, recurrent, or similarly sophisticated architectures) has produced a parallel revolution in the field of machine learning and has been employed across a wide variety of applications. Traditional MVPA also uses a form of machine learning, but most commonly with much simpler techniques based on linear calculations; a number of studies have applied deep learning techniques to neuroimaging data, but we believe those have barely scratched the surface of the potential deep learning holds for the field. In this paper, we provide a brief introduction to deep learning for those new to the technique, explore the logistical pros and cons of using deep learning to analyze neuroimaging data (which we term "deep MVPA," or dMVPA), and introduce a new software toolbox (the "Deep Learning In Neuroimaging: Exploration, Analysis, Tools, and Education" package, DeLINEATE for short) intended to facilitate dMVPA for neuroscientists (and indeed, scientists more broadly) everywhere.
Publisher
Cold Spring Harbor Laboratory