AUTHOR=Chou Zane Z., Bouteiller Jean-Marie C.
TITLE=Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks
JOURNAL=Frontiers in Computational Neuroscience
VOLUME=19
YEAR=2025
URL=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2025.1646810
DOI=10.3389/fncom.2025.1646810
ISSN=1662-5188
ABSTRACT=Artificial neural networks are limited in the number of patterns they can store and accurately recall, with capacity constraints arising from factors such as network size, architectural structure, pattern sparsity, and pattern dissimilarity. Exceeding these limits produces recall errors and, ultimately, catastrophic forgetting, a major challenge in continual learning. In this study, we characterize the theoretical maximum memory capacity of single-layer feedforward networks as a function of these parameters. We derive analytical expressions for the maximum theoretical memory capacity and introduce a grid-based construction and sub-sampling method for pattern generation that exploits the full storage potential of the network. Our findings indicate that maximum capacity scales as (N/S)^S, where N is the number of input/output units and S is the pattern sparsity, under threshold constraints related to minimum pattern differentiability. Simulation results validate these theoretical predictions and show that the optimal pattern set can be constructed deterministically for any given network size and pattern sparsity, systematically outperforming random pattern generation in terms of storage capacity. This work offers a foundational framework for maximizing storage efficiency in neural network systems and supports the development of data-efficient, sustainable AI.
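
The scaling claim in the abstract can be illustrated numerically. The sketch below is not the authors' published construction; it simply evaluates the (N/S)^S capacity bound and builds one plausible grid-style pattern set by partitioning the N units into S blocks of N/S units and activating exactly one unit per block, which yields exactly (N/S)^S distinct S-sparse patterns. The function names (theoretical_capacity, grid_patterns) and the blockwise layout are illustrative assumptions, not details taken from the paper.

```python
from itertools import product, islice
import numpy as np


def theoretical_capacity(N: int, S: int) -> int:
    """Evaluate the (N/S)^S capacity bound quoted in the abstract.

    N: number of input/output units, S: number of active units per pattern.
    This sketch assumes S divides N evenly.
    """
    assert N % S == 0, "sketch assumes S divides N"
    return (N // S) ** S


def grid_patterns(N: int, S: int, limit=None):
    """Illustrative grid-style construction (an assumption, not the paper's
    exact algorithm): split the N units into S contiguous blocks of N/S
    units and activate one unit per block, enumerating (N/S)^S distinct
    S-sparse binary patterns.
    """
    assert N % S == 0, "sketch assumes S divides N"
    block = N // S
    choices = product(range(block), repeat=S)  # one active index per block
    if limit is not None:
        choices = islice(choices, limit)       # optional sub-sampling cap
    for combo in choices:
        p = np.zeros(N, dtype=np.uint8)
        for b, idx in enumerate(combo):
            p[b * block + idx] = 1             # activate chosen unit in block b
        yield p


if __name__ == "__main__":
    N, S = 12, 3
    print(theoretical_capacity(N, S))   # 4^3 = 64 patterns for N=12, S=3
    pats = list(grid_patterns(N, S))
    print(len(pats), pats[0])            # 64 patterns, each with exactly S ones
```

For N = 12 and S = 3 the bound gives (12/3)^3 = 64 patterns, and the enumeration produces exactly that many distinct 3-sparse vectors; any pattern-differentiability thresholds discussed in the paper would further restrict which of these are retained.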