AUTHOR=Kobylarz Thaddeus J. A. TITLE=An AI methodology to reduce training intensity, error rates, and size of neural networks JOURNAL=Frontiers in Computational Neuroscience VOLUME=19 YEAR=2025 URL=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2025.1628115 DOI=10.3389/fncom.2025.1628115 ISSN=1662-5188 ABSTRACT=Massive computing systems are required to train neural networks. The prodigious amount of energy consumed makes the creation of AI applications a significant source of pollution. Despite the enormous training effort, neural network error rates limit their use in medical applications, because errors can lead to intolerable morbidity and mortality. Two factors contribute to the excessive training requirements and high error rates: an iterative reinforcement process (tuning) that does not guarantee convergence, and the deployment of neuron models capable of realizing only linearly separable switching functions. Tuning procedures require tens of thousands of training iterations. In addition, linearly separable neuron models have severely limited capability, which leads to large neural nets. For seven inputs, the ratio of total possible switching functions to linearly separable switching functions is 41 octillion. Addressed here is the creation of neuron models for the application of disease diagnosis. Algorithms are described that perform direct neuron creation, resulting in far fewer training steps than current AI systems require. The design algorithms produce neurons that do not manufacture errors (hallucinations). The algorithms utilize a template to create neuron models capable of performing any type of switching function, and they show that a neuron model capable of performing both linearly and nonlinearly separable switching functions is vastly superior to the neuron models currently in use.
Included examples illustrate use of the template for determining disease diagnoses (outputs) from symptoms (inputs). The examples show convergence with a single training iteration.
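The "41 octillion" figure in the abstract can be checked with a short calculation: the number of Boolean (switching) functions of 7 inputs is 2^(2^7), while the number of linearly separable (threshold) functions of 7 variables is a published combinatorial value, 8,378,070,864 (OEIS A000609); that count is taken from the literature, not derived here. A minimal sketch of the arithmetic:

```python
# Check the abstract's "41 octillion" ratio for n = 7 inputs.
# Total switching functions of n Boolean variables: 2 ** (2 ** n).
# Linearly separable (threshold) functions of 7 variables:
# 8,378,070,864 -- a published value (OEIS A000609), assumed here.
n = 7
total_functions = 2 ** (2 ** n)          # 2^128, about 3.4e38
linearly_separable = 8_378_070_864       # threshold functions of 7 variables
ratio = total_functions / linearly_separable
print(f"ratio ~ {ratio:.3e}")            # on the order of 4.06e28
print(f"in octillions (1e27): {ratio / 1e27:.1f}")  # roughly 41 octillion
```

The ratio comes out near 4.06 x 10^28, i.e. about 41 octillion, consistent with the abstract's claim about how few switching functions a linearly separable neuron model can realize.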