Event Abstract

Sleep as a Monte-Carlo: offline training of grammar-like models of semantic memory in the neocortex

In one view of memory consolidation, information is exchanged between two interacting memory systems: the first, centered on the hippocampus, stores episodes, defined as organized sets of memory items, together with their spatial and temporal context and their interrelationships. A second, semantic memory system, with a strong and distributed role for the neocortex, contains a rich and flexible depiction of the regularities in the world, which can conveniently be described as a statistical generative model of the world.

Training such a generative model is a very hard task, as it targets extremely complex and largely unpredictable structures. Computational linguistics provides very powerful statistical tools to describe, in a concise and computationally effective way, linguistic information, which is of comparable complexity. Stochastic context-free grammars (SCFGs) can be trained from a corpus of utterances through statistical procedures, such as the Inside-Outside algorithm (IOA), which tunes the transition probabilities of a branching process generating all possible parsing trees.
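The "inside" half of the IOA can be illustrated with a minimal sketch (a generic textbook formulation, not the authors' implementation; the grammar, function name, and data layout here are illustrative assumptions): a CYK-style dynamic program accumulates, for each span of the utterance, the probability that each nonterminal generates that span.

```python
from collections import defaultdict

def inside_probabilities(words, unary, binary, start="S"):
    """Compute inside probabilities beta[(i, j, A)] = P(A =>* words[i..j])
    for a stochastic CFG in Chomsky normal form, via CYK-style dynamic
    programming -- the bottom-up half of the Inside-Outside algorithm.

    unary:  dict word -> list of (nonterminal A, P(A -> word))
    binary: dict (A, B, C) -> P(A -> B C)
    """
    n = len(words)
    beta = defaultdict(float)
    # Base case: lexical rules A -> w over single-word spans.
    for i, w in enumerate(words):
        for A, p in unary.get(w, []):
            beta[(i, i, A)] += p
    # Recursion: binary rules A -> B C, summed over all split points k.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for (A, B, C), p in binary.items():
                    beta[(i, j, A)] += p * beta[(i, k, B)] * beta[(k + 1, j, C)]
    # The sentence probability is the inside value of the start symbol
    # over the full span; outside probabilities would then be computed
    # top-down and combined with these to re-estimate rule probabilities.
    return beta, beta[(0, n - 1, start)]
```

For example, with the toy grammar S -> A B (0.9), S -> B A (0.1), A -> "a" (1.0), B -> "b" (1.0), the utterance ["a", "b"] receives probability 0.9.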
Parsing trees, i.e. hierarchical groupings of items, are a compelling strategy for representing non-linguistic semantic knowledge as well. However, the application of such algorithms is limited, in this case, by the lack of the sequential ordering of items that is characteristic of linguistic material.

We adapted the IOA so that each episodic memory is expressed as a matrix of associations between items, with this association matrix obviating the need for sequential ordering. Moreover, we show that the algorithm maps into an offline neural dynamics for the neocortex, orchestrated by an input stage (roughly identifiable with the hippocampus) that reactivates the association matrix in terms of the correlations between unit activities. The inside and outside probabilities in the IOA map on, respectively, bottom-up and top-down influences in the hierarchy of cortical areas, and the model can be trained by means of a Hebbian learning rule local to each cortical micro-circuit.
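The local, Hebbian character of such a learning rule can be conveyed with a generic sketch (this is not the authors' actual model; the function name, update form, and learning rate are assumptions): connection weights in a cortical micro-circuit are nudged in proportion to the pairwise correlations of reactivated unit activities, which is exactly the information a replayed association matrix provides.

```python
import numpy as np

def hebbian_update(W, activity, lr=0.01):
    """One Hebbian learning step on a weight matrix W (units x units).

    activity: array of shape (n_units, n_time_bins) of reactivated
    activity, as would be driven by hippocampal replay. Each weight is
    strengthened in proportion to the pairwise correlation of its two
    units -- a rule that is local to each connection.
    """
    # Empirical pairwise correlation pattern of the reactivated activity.
    C = activity @ activity.T / activity.shape[1]
    np.fill_diagonal(C, 0.0)  # no self-connections
    return W + lr * C
```

In this sketch, units that are repeatedly co-reactivated during replay (i.e. items that were associated within an episode) end up more strongly coupled, while uncorrelated pairs are left untouched.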

During sleep, the driving force for learning is the set of pairwise correlation patterns in the reactivated neural activity, which convey information about the content and the structure of each rehearsed episode. This process is modeled on features of memory replay as it has been experimentally demonstrated in the hippocampus and the neocortex.

The model can reproduce several memory consolidation-related effects, such as the decontextualization of memories and the generation of "insight", i.e. the discovery of hidden, higher-order structure.

Conference: Computational and systems neuroscience 2009, Salt Lake City, UT, United States, 26 Feb - 3 Mar, 2009.

Presentation Type: Poster Presentation

Topic: Poster Presentations

Citation: (2009). Sleep as a Monte-Carlo: offline training of grammar-like models of semantic memory in the neocortex. Front. Syst. Neurosci. Conference Abstract: Computational and systems neuroscience 2009. doi: 10.3389/conf.neuro.06.2009.03.009

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 29 Jan 2009; Published Online: 29 Jan 2009.