AUTHOR=Deckers Lucas, Tsang Ing Jyh, Van Leekwijck Werner, Latré Steven TITLE=Extended liquid state machines for speech recognition JOURNAL=Frontiers in Neuroscience VOLUME=16 YEAR=2022 URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2022.1023470 DOI=10.3389/fnins.2022.1023470 ISSN=1662-453X ABSTRACT=A liquid state machine (LSM) is a biologically plausible model of a cortical microcircuit. Essentially, it consists of a random, sparse reservoir of recurrently connected spiking neurons with fixed synapses and a trainable readout layer. The LSM exhibits low training complexity and enables backpropagation-free learning in a powerful, yet simple computing paradigm. In this work, the liquid state machine is enhanced by a set of bio-inspired extensions to create the extended liquid state machine (ELSM). By including excitatory/inhibitory balance, spike-frequency adaptation and neuronal heterogeneity, we show that the ELSM consistently improves upon the LSM and can even attain performance similar to the current state of the art in spiking neural networks on a set of speech recognition benchmarks. The ELSM retains the benefits of the straightforward LSM structure and training procedure. The proposed extensions yield up to a 5.12% average increase in accuracy while decreasing the number of spikes in the ELSM by up to 20.21% on our benchmark speech data sets. Furthermore, we illustrate that the ELSM input-liquid and recurrent synaptic weights can be compressed to 4-bit resolution without any significant loss in classification performance. We thus show that the ELSM is a powerful, biologically plausible and hardware-friendly spiking neural network model that attains state-of-the-art accuracy on speech recognition benchmarks for spiking neural networks.
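The abstract's core architecture (a fixed, sparse, random recurrent reservoir of spiking neurons with only the readout layer trained) can be illustrated with a minimal sketch. This is not the authors' implementation: the LIF dynamics, network sizes, sparsity level, and the least-squares readout below are all illustrative assumptions chosen to show the backpropagation-free training scheme the abstract describes.

```python
# Minimal LSM sketch: random sparse reservoir of leaky integrate-and-fire
# (LIF) neurons with FIXED synapses, plus a TRAINABLE linear readout.
# All sizes and constants are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_RES, T = 8, 100, 50                            # inputs, reservoir size, time steps
W_in = rng.normal(0.0, 0.5, (N_RES, N_IN))             # fixed input-to-liquid weights
mask = rng.random((N_RES, N_RES)) < 0.1                # sparse connectivity (~10%)
W_rec = rng.normal(0.0, 0.3, (N_RES, N_RES)) * mask    # fixed recurrent weights

def run_reservoir(spikes_in, tau=0.9, v_th=1.0):
    """Drive the reservoir with an input spike train; return per-neuron spike counts."""
    v = np.zeros(N_RES)              # membrane potentials
    counts = np.zeros(N_RES)         # accumulated spike counts (the "liquid state")
    prev = np.zeros(N_RES)           # spikes emitted at the previous step
    for t in range(spikes_in.shape[0]):
        v = tau * v + W_in @ spikes_in[t] + W_rec @ prev   # leaky integration
        prev = (v >= v_th).astype(float)                   # threshold crossing -> spike
        v[prev == 1] = 0.0                                 # reset after spiking
        counts += prev
    return counts

# Only the readout is trained (here via least squares on reservoir states);
# the reservoir synapses above are never updated.
X = np.stack([run_reservoir(rng.random((T, N_IN)) < 0.2) for _ in range(40)])
y = rng.integers(0, 2, 40)                                 # dummy binary labels
W_out, *_ = np.linalg.lstsq(X, y.astype(float), rcond=None)
pred = (X @ W_out > 0.5).astype(int)
```

The paper's ELSM would extend the reservoir dynamics above with excitatory/inhibitory balance, spike-frequency adaptation, and heterogeneous neuron parameters, while keeping the same train-only-the-readout procedure.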