AUTHOR=Daruwalla Kyle, Lipasti Mikko
TITLE=Information bottleneck-based Hebbian learning rule naturally ties working memory and synaptic updates
JOURNAL=Frontiers in Computational Neuroscience
VOLUME=18
YEAR=2024
URL=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2024.1240348
DOI=10.3389/fncom.2024.1240348
ISSN=1662-5188
ABSTRACT=Deep neural feedforward networks are effective models for a wide array of problems, but training and deploying such networks presents a significant energy cost (Strubell et al., 2019). Spiking neural networks (SNNs), which are modeled after biologically realistic neurons, offer a potential solution when deployed correctly on neuromorphic computing hardware. Still, many applications train SNNs offline, and running network training directly on neuromorphic hardware is an ongoing research problem. The primary hurdle is that back-propagation, which makes training such artificial deep networks possible, is biologically implausible. Neuroscientists are uncertain about how the brain would propagate a precise error signal backward through a network of neurons. Recent progress (Lillicrap et al., 2014, 2020) addresses part of this question, e.g., the weight transport problem, but a complete solution remains intangible. In contrast, novel learning rules (Ma et al., 2019; Pogodin and Latham, 2020) based on the information bottleneck (IB) train