Abstract
Modern well-performing approaches to neural decoding are based on machine learning models such as decision tree ensembles and deep neural networks. The wide range of algorithms that can be used to learn from neural spike trains, which are essentially time-series data, creates a need for diverse and challenging neural decoding benchmarks, similar to those in computer vision and natural language processing. In this work, we propose a spike train classification benchmark, based on open-access neural activity datasets and consisting of several learning tasks such as stimulus type classification, prediction of the animal's behavioral state, and neuron type identification. We demonstrate that an approach based on hand-crafted time-series feature engineering establishes a strong baseline, performing on par with state-of-the-art deep learning-based models for neural decoding. We release the code to reproduce the reported results.
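The abstract describes a hand-crafted feature-engineering baseline for spike train classification. Below is a minimal sketch of what such a baseline could look like: simple summary statistics computed per spike train (firing rate, inter-spike-interval statistics, Fano factor of binned counts) fed to a standard classifier. The specific features, parameters, and synthetic data here are illustrative assumptions, not the paper's exact pipeline.

```python
# Illustrative sketch of a hand-crafted time-series feature baseline for
# spike train classification (not the authors' exact feature set).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

def spike_train_features(spike_times, duration=10.0, bin_size=0.1):
    """Summary features for one spike train (spike times in seconds)."""
    spike_times = np.asarray(spike_times)
    rate = len(spike_times) / duration                       # mean firing rate
    isi = np.diff(spike_times) if len(spike_times) > 1 else np.array([duration])
    counts, _ = np.histogram(spike_times,
                             bins=np.arange(0.0, duration + bin_size, bin_size))
    fano = counts.var() / counts.mean() if counts.mean() > 0 else 0.0
    return np.array([
        rate,
        isi.mean(),
        isi.std(),
        isi.std() / isi.mean() if isi.mean() > 0 else 0.0,   # coefficient of variation of ISIs
        fano,                                                 # Fano factor of binned counts
    ])

# Toy data: 40 synthetic Poisson spike trains, two firing-rate regimes as class labels.
rng = np.random.default_rng(0)
X = np.vstack([
    spike_train_features(np.sort(rng.uniform(0.0, 10.0, size=rng.poisson(lam))))
    for lam in ([50] * 20 + [150] * 20)
])
y = np.array([0] * 20 + [1] * 20)

scores = cross_val_score(GradientBoostingClassifier(), X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```

In practice, a richer feature set (e.g., autocorrelation or automatically extracted time-series features) and the benchmark's real datasets would replace the toy features and synthetic trains used here.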
Funder
HSE Basic Research Program
Publisher
Public Library of Science (PLoS)
Subject
Computational Theory and Mathematics, Cellular and Molecular Neuroscience, Genetics, Molecular Biology, Ecology, Modeling and Simulation, Ecology, Evolution, Behavior and Systematics
Cited by
3 articles.