Abstract
While recalling lists of unrelated items is highly challenging, we can recall much longer sequences when they are structured as an episode or story. Why such structure has this striking influence on memory is unknown. We introduce a model in which the experience of an episode or story is represented as a path through a pre-existing network of cognitive states. We demonstrate that, by summing the neural representations of the visited states, this path can be transformed into a simple neural code: a path vector. We show how, by leveraging sparse connectivity and high dimensionality, path vectors provide robust codes for a large set of sequences and can be decoded mechanistically for memory retrieval. Fitting our model to data reveals how human free and serial recall may emerge from adapting coding mechanisms tuned for sequences that align with existing network paths. We thus posit that sequences such as episodes or stories map more directly onto existing cognitive network paths than arbitrary lists do, with the latter eliciting paths that tend to interfere and impair recall. Our model suggests that mnemonic strategies such as imposing narrative structure on a list improve recall by reducing this path interference. This work illuminates a simple, biologically plausible means of flexibly recruiting existing cognitive structures to encode new memories.
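The abstract does not specify implementation details, but the core coding idea it describes (summing sparse, high-dimensional state representations into a single path vector and decoding which states were visited) can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the dimensionality, sparsity level, number of states, and decoding threshold are all assumed parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper).
N_DIM = 2000     # dimensionality of each state's neural representation
N_ACTIVE = 20    # active units per state (sparse code)
N_STATES = 200   # size of the pre-existing network of cognitive states

# Each cognitive state is represented by a sparse binary vector:
# a small random subset of active units in a high-dimensional space.
state_codes = np.zeros((N_STATES, N_DIM))
for s in range(N_STATES):
    state_codes[s, rng.choice(N_DIM, size=N_ACTIVE, replace=False)] = 1.0

def encode_path(path):
    """Form a path vector by summing the representations of the visited states."""
    return state_codes[path].sum(axis=0)

def decode_states(path_vector, threshold=0.8):
    """Recover which states were visited: flag a state as 'on the path' if most
    of its active units are also active in the path vector. With sparse codes in
    a high-dimensional space, spurious overlap with unvisited states stays low."""
    overlap = (state_codes @ (path_vector > 0)) / N_ACTIVE
    return np.flatnonzero(overlap >= threshold)

# Example: encode a path through five states of the network, then decode it.
path = [3, 17, 42, 99, 150]
pv = encode_path(path)
recovered = decode_states(pv)
print(sorted(path) == list(recovered))  # True in this toy setting
```

In the model described by the abstract, the temporal order of the recovered states would be reconstructed from the pre-existing network itself (i.e., by following edges among the decoded states); the sketch above only illustrates the membership-decoding step.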
Publisher
Cold Spring Harbor Laboratory