AUTHOR=Amit Yali, Walker Jacob TITLE=Recurrent network of perceptrons with three state synapses achieves competitive classification on real inputs JOURNAL=Frontiers in Computational Neuroscience VOLUME=6 YEAR=2012 URL=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2012.00039 DOI=10.3389/fncom.2012.00039 ISSN=1662-5188 ABSTRACT=

We describe an attractor network of binary perceptrons receiving inputs from a retinotopic visual feature layer. Each class is represented by a random subpopulation of the attractor layer, which is turned on in a supervised manner during learning of the feedforward connections. These are discrete three-state synapses, updated according to a simple field-dependent Hebbian rule. For testing, the attractor layer is initialized by the feedforward inputs and then undergoes asynchronous random updating until convergence to a stable state. Classification is indicated by the subpopulation that is persistently activated. The contribution of this paper is two-fold. First, this is the first example of competitive classification rates on real data being achieved through recurrent dynamics in the attractor layer, which are only stable if recurrent inhibition is introduced. Second, we demonstrate that employing three-state synapses with feedforward inhibition is essential for achieving the competitive classification rates, since it allows both positively and negatively informative features to be exploited effectively.
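As a rough illustration of the procedure summarized in the abstract, the following NumPy sketch implements a supervised phase with three-state feedforward synapses and a test phase with asynchronous recurrent dynamics. All layer sizes, thresholds, update probabilities, the precise form of the field-dependent rule, and the recurrent/inhibitory wiring are illustrative assumptions, not the parameters or rule used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and parameters, chosen only for illustration.
N_IN, N_ATT, N_CLASSES = 64, 120, 3
POP_FRAC = 0.2      # fraction of attractor units assigned to each class
THETA = 2.0         # activation threshold (assumed)
KAPPA = 1.0         # strength of global, activity-dependent inhibition (assumed)

# Each class is represented by a random subpopulation of the attractor layer.
pops = [rng.random(N_ATT) < POP_FRAC for _ in range(N_CLASSES)]

# Feedforward synapses take one of three discrete states {-1, 0, +1}.
J_ff = np.zeros((N_ATT, N_IN), dtype=int)

# Recurrent excitation wired within each class subpopulation (a simplifying assumption),
# with inhibition applied globally during the dynamics below.
J_rec = np.zeros((N_ATT, N_ATT))
for pop in pops:
    J_rec[np.outer(pop, pop)] = 1.0
np.fill_diagonal(J_rec, 0.0)

def learn(x, label, p_pot=0.5, p_dep=0.5):
    """Supervised presentation: clamp the class subpopulation on and apply a
    stochastic, field-dependent Hebbian-style update to the three-state synapses.
    The conditions and thresholds here are guesses at the spirit of such a rule."""
    target = pops[label]
    h = J_ff @ x                                   # feedforward field of each unit
    for i in range(N_ATT):
        if target[i] and h[i] < THETA:             # clamped-on unit, field too low: potentiate
            mask = (x > 0) & (rng.random(N_IN) < p_pot)
            J_ff[i, mask] = np.minimum(J_ff[i, mask] + 1, 1)
        elif not target[i] and h[i] >= THETA:      # off unit, field too high: depress
            mask = (x > 0) & (rng.random(N_IN) < p_dep)
            J_ff[i, mask] = np.maximum(J_ff[i, mask] - 1, -1)

def classify(x, n_sweeps=20):
    """Initialize the attractor layer from the feedforward input, then run
    asynchronous random updates with recurrent excitation and global inhibition
    until a stable state is (approximately) reached."""
    h_ff = J_ff @ x
    s = (h_ff > THETA).astype(float)
    for _ in range(n_sweeps):
        changed = False
        for i in rng.permutation(N_ATT):           # asynchronous, random update order
            field = h_ff[i] + J_rec[i] @ s - KAPPA * s.sum()
            new = float(field > THETA)
            changed |= new != s[i]
            s[i] = new
        if not changed:                            # converged to a stable state
            break
    # Classification: the subpopulation that remains persistently activated.
    return int(np.argmax([s[pop].mean() for pop in pops]))

# Toy usage: binary inputs where each class lights up a distinct band of features.
for epoch in range(5):
    for c in range(N_CLASSES):
        x = (rng.random(N_IN) < 0.1).astype(int)
        x[c * 20:(c + 1) * 20] = 1
        learn(x, c)

x_test = np.zeros(N_IN, dtype=int)
x_test[20:40] = 1
print("predicted class:", classify(x_test))
```

The read-out follows the abstract's description: after convergence, the class whose subpopulation shows the largest persistent activity is reported; everything else in the sketch (band-structured toy inputs, pre-wired recurrent excitation, the specific thresholds) is placeholder detail.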