The Tensor Brain: A Unified Theory of Perception, Memory, and Semantic Decoding

Authors:

Volker Tresp¹, Sahand Sharifzadeh², Hang Li³, Dario Konopatzki⁴, Yunpu Ma⁵

Affiliations:

1. LMU Munich and Siemens Munich, Germany; volker.tresp@lmu.de

2. LMU Munich, Germany; sahand.sharifzadeh@gmail.com

3. LMU Munich and Siemens Munich, Germany; hang.li@campus.lmu.de

4. LMU Munich, Germany; dk@dkonopatzki.de

5. LMU Munich and Siemens Munich, Germany; cognitive.yunpu@gmail.com

Abstract

We present a unified computational theory of an agent's perception and memory. In our model, both perception and memory are realized by different operational modes of the oscillating interactions between a symbolic index layer and a subsymbolic representation layer. The two layers form a bilayer tensor network (BTN). The index layer encodes indices for concepts, predicates, and episodic instances. The representation layer broadcasts information and reflects the cognitive brain state; it is our model of what authors have called the “mental canvas” or the “global workspace.” As a bridge between perceptual input and the index layer, the representation layer enables the grounding of indices by their subsymbolic embeddings, which are implemented as connection weights linking both layers. The propagation of activation to earlier perceptual processing layers in the brain can lead to embodiments of indices. Perception and memories first create subsymbolic representations, which are subsequently decoded semantically to produce sequences of activated indices that form symbolic triple statements. The brain is a sampling engine: only activated indices are communicated to the remaining parts of the brain. Triple statements are dynamically embedded in the representation layer and embodied in earlier processing layers: the brain speaks to itself. Although memory appears to be about the past, its main purpose is to support the agent in the present and the future. Recent episodic memory provides the agent with a sense of the here and now. Remote episodic memory retrieves relevant past experiences to provide information about possible future scenarios. This aids the agent in decision making. “Future” episodic memory, based on expected future events, guides planning and action. Semantic memory retrieves specific information, which is not delivered by current perception, and defines priors for future observations. We argue that it is important for the agent to encode individual entities, not just classes and attributes. Perception is learning: episodic memories are constantly being formed, and we demonstrate that a form of self-supervised learning can acquire new concepts and refine existing ones. We test our model on a standard benchmark data set, which we expanded to contain richer representations for attributes, classes, and individuals. Our key hypothesis is that obtaining a better understanding of perception and memory is a crucial prerequisite to comprehending human-level intelligence.
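As a rough illustration of the decoding cycle described above, the sketch below shows how a representation-layer vector might be decoded into a symbolic (subject, predicate, object) triple: index embeddings serve as the connection weights between the two layers, each step samples a single activated index, and the embedding of the sampled index is fed back into the representation layer before the next step. This is a minimal toy sketch, not the authors' implementation; the vocabulary, embedding dimension, inverse temperature beta, and feedback weight alpha are assumed values chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary of indices (the paper's index layer holds
# concepts, predicates, and episodic instances learned from data).
entities = ["Sparky", "Dog", "Bone"]
predicates = ["isA", "chews"]
d = 16  # assumed dimension of the representation layer

# Embeddings play the role of connection weights between the index layer
# and the representation layer; here they are random for illustration.
A_ent = rng.normal(size=(len(entities), d))
A_pred = rng.normal(size=(len(predicates), d))

def sample_index(q, A, beta=1.0):
    """One semantic-decoding step: score all indices against the current
    representation-layer state q and sample one activated index."""
    scores = beta * (A @ q)
    p = np.exp(scores - scores.max())
    p /= p.sum()
    return rng.choice(len(p), p=p)

def decode_triple(q, alpha=0.5):
    """Decode a (subject, predicate, object) triple from q, feeding the
    embedding of each sampled index back into the representation layer
    ('the brain speaks to itself')."""
    s = sample_index(q, A_ent)
    q = q + alpha * A_ent[s]
    p = sample_index(q, A_pred)
    q = q + alpha * A_pred[p]
    o = sample_index(q, A_ent)
    return entities[s], predicates[p], entities[o]

# A perceptual front end would produce q; a random vector stands in here.
q_scene = rng.normal(size=d)
print(decode_triple(q_scene))
```

Because each step samples an index rather than taking a hard argmax, repeated decoding of the same scene vector can produce different triples, in line with the abstract's characterization of the brain as a sampling engine in which only activated indices are communicated.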

Publisher

MIT Press

Subject

Cognitive Neuroscience, Arts and Humanities (miscellaneous)

Cited by 1 article.

1. Do DALL-E and Flamingo Understand Each Other? 2023 IEEE/CVF International Conference on Computer Vision (ICCV), 2023-10-01.
