AUTHOR=Zivasatienraj Bill, Doolittle W. Alan TITLE=Dynamical memristive neural networks and associative self-learning architectures using biomimetic devices JOURNAL=Frontiers in Neuroscience VOLUME=17 YEAR=2023 URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2023.1153183 DOI=10.3389/fnins.2023.1153183 ISSN=1662-453X ABSTRACT=

While there is an abundance of research on neural networks that are “inspired” by the brain, few mimic the critical temporal compute features that allow the brain to efficiently perform complex computations. Even fewer methods emulate the heterogeneity of learning produced by biological neurons. Memory devices, such as memristors, are also investigated for their potential to implement neuronal functions in electronic hardware. However, memristors in computing architectures typically operate as non-volatile memories, either as storage or as the weights in a multiply-and-accumulate function that requires direct access to manipulate memristance via a costly learning algorithm. Hence, the integration of memristors into architectures as time-dependent computational units is studied, starting with the development of a compact and versatile mathematical model that is capable of emulating flux-linkage controlled analog (FLCA) memristors and their unique temporal characteristics. The proposed model, which is validated against experimental FLCA LixNbO2 intercalation devices, is used to create memristive circuits that mimic neuronal behaviors such as desensitization, paired-pulse facilitation, and spike-timing-dependent plasticity. The model is then used to demonstrate building blocks of biomimetic learning: dynamical memristive circuits implement biomimetic learning rules in a self-training neural network whose dynamical memristive weights are capable of associative lifelong learning. Successful training of the dynamical memristive neural network to classify handwritten digits is shown, including lifelong learning in which the network relearns different characters in succession. An analog computing architecture that learns to associate input-to-input correlations is also introduced, with examples demonstrating image classification and pattern recognition without convolution. The biomimetic functions shown in this paper result from fully ion-driven memristive circuits devoid of integrating capacitors. They are therefore instructive for exploiting the immense potential of memristive technology for neuromorphic computation in hardware, and they allow a common architecture to be applied to a wide range of learning rules, including those based on spike timing (STDP), magnitude, frequency, and pulse shape, among others, enabling an inorganic implementation of the complex heterogeneity of biological neural systems.
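
For readers who want a concrete feel for flux-linkage control, the following is a minimal sketch of a generic flux-controlled memristor, i(t) = W(φ(t))·v(t) with dφ/dt = v(t). It is not the paper's validated FLCA LixNbO2 model: the sigmoidal memductance window W(φ), the parameter values, and the forward-Euler integration are illustrative assumptions chosen only to show how conductance tracks the time integral of the applied voltage.

```python
import numpy as np

# Generic flux-controlled memristor: i(t) = W(phi(t)) * v(t), dphi/dt = v(t).
# W(phi) is a hypothetical sigmoidal memductance window, NOT the paper's
# validated FLCA LixNbO2 model; all parameter values are arbitrary.

def memductance(phi, g_min=1e-5, g_max=1e-3, phi_0=0.0, k=1e3):
    """Hypothetical memductance (siemens) as a smooth function of flux linkage (V*s)."""
    return g_min + (g_max - g_min) / (1.0 + np.exp(-k * (phi - phi_0)))

def simulate(v, t, phi_init=0.0):
    """Integrate dphi/dt = v with forward Euler; return flux linkage and current."""
    dphi = np.concatenate(([0.0], v[:-1] * np.diff(t)))
    phi = phi_init + np.cumsum(dphi)
    i = memductance(phi) * v
    return phi, i

if __name__ == "__main__":
    t = np.linspace(0.0, 0.1, 10_000)               # 100 ms of simulated time
    v = 0.5 * np.sin(2 * np.pi * 50 * t)            # 50 Hz, 0.5 V sinusoidal drive
    phi, i = simulate(v, t)
    # Because i = W(phi) * v, the current is zero whenever v is zero, giving the
    # pinched i-v hysteresis loop that is the signature of memristive behavior.
    print("peak current (A):", i.max(), " final flux linkage (V*s):", phi[-1])
```

Because the state variable is the flux linkage itself, the device "remembers" the history of the applied voltage, which is the property the architecture exploits as a time-dependent computational unit rather than as a static stored weight.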
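The temporal behaviors named in the abstract can also be illustrated with a toy simulation. The sketch below shows paired-pulse facilitation using a volatile, decaying state variable; again, the decay time constant, the linear conductance map, and the pulse parameters are assumptions for illustration and do not reproduce the paper's ion-driven circuits.

```python
import numpy as np

# Toy paired-pulse facilitation (PPF): two identical voltage pulses separated by
# a gap; the response to the second pulse is larger when the gap is short because
# the volatile state excited by the first pulse has not yet decayed.
# The state equation dx/dt = |v|/k - x/tau is an illustrative assumption,
# NOT the paper's FLCA device model.

def ppf_ratio(gap, v_pulse=0.3, t_pulse=1e-3, tau=20e-3,
              g_min=1e-5, g_max=1e-3, k=5e-3, dt=1e-5):
    """Return (peak current of 2nd pulse) / (peak current of 1st pulse)."""
    t = np.arange(0.0, gap + 3 * t_pulse, dt)
    # Two identical rectangular voltage pulses, the second starting at t = gap.
    v = np.where((t < t_pulse) | ((t >= gap) & (t < gap + t_pulse)), v_pulse, 0.0)
    x, i1, i2 = 0.0, 0.0, 0.0
    for n, tn in enumerate(t):
        g = g_min + (g_max - g_min) * min(x, 1.0)   # state-dependent conductance
        i = g * v[n]
        if tn < t_pulse:
            i1 = max(i1, i)
        elif tn >= gap:
            i2 = max(i2, i)
        # Volatile state: driven up by the applied voltage, relaxes with tau.
        x += dt * (abs(v[n]) / k - x / tau)
    return i2 / i1

if __name__ == "__main__":
    for gap in (2e-3, 5e-3, 20e-3, 100e-3):
        print(f"gap = {gap*1e3:5.1f} ms  PPF ratio = {ppf_ratio(gap):.2f}")
```

Running the sketch shows the facilitation ratio decaying toward 1 as the inter-pulse interval grows, the qualitative fingerprint of paired-pulse facilitation; the same kind of state-plus-timing interplay underlies timing-dependent rules such as STDP in the dynamical memristive circuits described in the paper.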