Affiliation:
1. School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing, China
2. Department of Neurosurgery, Beijing Tiantan Hospital, Capital Medical University, Beijing, China
3. China National Clinical Research Center for Neurological Diseases, Beijing, China
4. Department of Health Technology and Informatics, Hong Kong Polytechnic University, Hong Kong SAR, China
5. Shunde Innovation School, University of Science and Technology Beijing, Foshan, Guangdong, China
Abstract
Owing to their superior capabilities and advanced achievements, Transformers have gradually attracted attention for understanding complex brain processing mechanisms. This study comprehensively reviews and discusses the applications of Transformers in brain sciences. First, we briefly introduce the critical architecture of Transformers. Then, we survey and analyze their most relevant applications in brain sciences, including brain disease diagnosis, brain age prediction, brain anomaly detection, semantic segmentation, multi-modal registration, functional magnetic resonance imaging (fMRI) modeling, electroencephalogram (EEG) processing, and multi-task collaboration. We organize the model details and open-source resources for reference and replication. In addition, we discuss the quantitative assessment, model complexity, and optimization of Transformers, which are topics of great concern in the field. Finally, we explore possible future challenges and opportunities, drawing on concrete recent cases to provoke discussion and innovation. We hope that this review will stimulate further research on Transformers in the context of brain sciences.
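The critical architecture referenced in the abstract centers on self-attention. As a point of orientation for readers, the sketch below shows the standard scaled dot-product attention operation, softmax(QK^T / sqrt(d_k))V, in plain NumPy; it is a minimal illustration of the general mechanism, not code from any model surveyed in this review, and the toy dimensions are arbitrary.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core Transformer operation."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # pairwise token similarities
    scores -= scores.max(axis=-1, keepdims=True)          # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)        # row-wise softmax
    return weights @ V                                    # weighted sum of values

# Toy example: 3 tokens with embedding dimension 4 (hypothetical sizes)
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # -> (3, 4): one attended representation per token
```

In the brain-imaging applications surveyed here, the same operation is typically applied to patch embeddings of MRI volumes or to temporal windows of EEG/fMRI signals rather than word tokens.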
Funder
National Natural Science Foundation of China
Fundamental Research Funds for the Central Universities
China Postdoctoral Science Foundation
Cited by
2 articles.