Authors:
Jiang Yanru, Dale Rick, Lu Hongjing
Subjects:
Artificial Intelligence, Cognitive Neuroscience, Experimental and Cognitive Psychology, Software
References (43 articles):
1. Baan, J., Hoeve, M. T., Wees, M. V., Schuth, A., & de Rijke, M. (2019). Do transformer attention heads provide transparency in abstractive summarization? arXiv preprint arXiv:1907.00570.
2. Baddeley (2019). From short-term store to multicomponent working memory: The role of the modal model. Memory & Cognition.
3. Chemero (2001). Dynamical explanation and mental representations. Trends in Cognitive Sciences.
4. Chen (2022). An automated quality evaluation framework of psychotherapy conversations with local quality estimates. Computer Speech & Language.
5. Conway (2020). How does the brain learn environmental structure? Ten core principles for understanding the neurocognitive mechanisms of statistical learning. Neuroscience & Biobehavioral Reviews.