Author:
Yuan Jiahong, Cai Xingyu, Bian Yuchen, Ye Zheng, Church Kenneth
Abstract
Pauses, disfluencies and language problems in Alzheimer's disease can be naturally modeled by fine-tuning Transformer-based pre-trained language models such as BERT and ERNIE. Using this method with pause-encoded transcripts, we achieved 89.6% accuracy on the test set of the ADReSS (Alzheimer's Dementia Recognition through Spontaneous Speech) Challenge. The best accuracy was obtained with ERNIE, plus an encoding of pauses. Robustness is a challenge for large models and small training sets. Ensembling over many runs of BERT/ERNIE fine-tuning reduced variance and improved accuracy. We found that um was used much less frequently in Alzheimer's speech, compared to uh. We discussed this interesting finding from linguistic and cognitive perspectives.
Cited by
17 articles.