OPINION article

Front. Neuroanat.
Volume 18 - 2024 | doi: 10.3389/fnana.2024.1374864

Of Artificial Intelligence, Machine Learning, and the Human Brain: Celebrating Miklos Palkovits' 90th Birthday (Provisionally Accepted)

  • 1Anatomy, Physiology and Genetics, Uniformed Services University of the Health Sciences, United States

The final, formatted version of the article will be published soon.



The promises and challenges of Artificial Intelligence (AI), Machine Learning (ML), and Deep Learning (DL) rest on the premise that we can build machines and write algorithms that mimic, and even surpass, the capacity and capabilities of the human brain (Alzubaidi et al., 2021). AI uses Artificial Neuronal Networks (ANNs) intended to mimic the workings of the neuronal networks of the human brain. In an ANN, the strength of the connection between each "neuron" and its "neighbor" is a parameter called a "weight". The network starts with random weights and adjusts them during "training" until its output agrees with the correct answer. Training includes "reading" huge volumes of text in which some words are masked and "asking" the network to "guess" what those masked words are. Using over 3 BILLION words, the network "learns" to predict them (Jain et al., 1996). By comparison, an average child requires roughly 3,000 (!) times FEWER words to learn to speak a language (Saenko, 2020). It should be noted, however, that a child needs much more time, four to five years, to learn approximately 3,000 words. Regardless, as generative AI (e.g., ChatGPT, Google's Bard) illustrates, such "brute force" works well for certain brain functions, i.e., storing, analyzing, and finding correlations in massive amounts of existing data (Polyportis and Pahos, 2024).
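The training loop described above - random initial weights nudged until the network's guess for each masked word matches the true one - can be sketched in miniature. The toy below is purely illustrative: a single linear layer over a five-word vocabulary trained by gradient descent, not a transformer or an LLM, and the tiny corpus, learning rate, and epoch count are all invented for the demonstration.

```python
import math, random

# Toy "masked word" predictor: a single weight matrix maps a bag of
# context words to a score per vocabulary word. Weights start random
# and are adjusted until the predicted masked word is the correct one.
vocab = ["the", "cat", "sat", "on", "mat"]
V = len(vocab)
idx = {w: i for i, w in enumerate(vocab)}

# training pairs: (context with one word removed, the masked word)
data = [(["the", "sat", "on", "the", "mat"], "cat"),
        (["the", "cat", "on", "the", "mat"], "sat"),
        (["the", "cat", "sat", "on", "the"], "mat")]

random.seed(0)
W = [[random.uniform(-0.1, 0.1) for _ in range(V)] for _ in range(V)]

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def predict(context):
    # bag-of-words input vector, then a probability per vocabulary word
    x = [0.0] * V
    for w in context:
        x[idx[w]] += 1.0
    scores = [sum(W[i][j] * x[j] for j in range(V)) for i in range(V)]
    return x, softmax(scores)

lr, epochs = 0.1, 500
for _ in range(epochs):
    for context, target in data:
        x, p = predict(context)
        t = idx[target]
        for i in range(V):
            # gradient of the cross-entropy loss w.r.t. each weight
            grad = p[i] - (1.0 if i == t else 0.0)
            for j in range(V):
                W[i][j] -= lr * grad * x[j]

for context, target in data:
    _, p = predict(context)
    print(" ".join(context), "->", vocab[max(range(V), key=p.__getitem__)])
```

An LLM does the same thing in spirit, with billions of weights instead of twenty-five, which is why it needs billions of example words rather than a handful.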
Current AI, however, comes with caveats. One is the abovementioned inefficiency with which ANNs - even Large Language Models (LLMs) - "learn". Another is AI's currently limited capacity for intuition and creativity compared to the human brain, despite the landmark 2016 victory of Google's AlphaGo over the South Korean Go champion Lee Se-dol (Metz, 2016). A third and critical issue is the enormous energy required by generative AI (de Vries, 2023; Saenko, 2020). Training an ANN, i.e., reading through vast amounts of data until the system "understands" them, can consume as much electricity as a small country. Currently, approximately 2% of TOTAL, GLOBAL electricity production is used by data centers, and this is only the very beginning of AI. With the predicted growth of AI - assessed by the annual rate of increase in chip production, e.g., by NVIDIA - electricity demand for AI will increase dramatically. By some estimates, global electricity demand for AI and related computing could reach 85 to 134 TWh annually within a few years, roughly the yearly electricity consumption of a country the size of Sweden (de Vries, 2023). The effect of such an increase on the "carbon footprint", given the current mix of electric power generation (natural gas: 38%; coal: 22%; nuclear: 19%; renewables: 20%, of which hydroelectric: 6%), can be alarmingly high (Dhar, 2020; Heikkilä, 2023). For example, training GPT-3 required 1,287 MWh of electricity and emitted an estimated 552 tons of CO2 equivalent - and this was BEFORE any user had run a single query! (Patterson et al., 2021). It is no surprise that Microsoft has been interested - and has invested - in nuclear power generation, especially in Small Modular Reactors (SMRs), which do not increase the carbon footprint (McFadden, 2023).
(Microsoft has also invested in Helion, a Sam Altman-backed company that plans to generate electricity using futuristic, nuclear-fusion-based power (Gardner, 2023).)
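The GPT-3 figures quoted above (Patterson et al., 2021) imply a carbon intensity that is easy to verify with back-of-the-envelope arithmetic; the short sketch below uses only the numbers already cited in the text.

```python
# Back-of-the-envelope check of the quoted GPT-3 training figures:
# 1,287 MWh of electricity and 552 tons of CO2-equivalent emissions.
training_mwh = 1287
emissions_t_co2e = 552

# implied carbon intensity of the electricity that was used
t_per_mwh = emissions_t_co2e / training_mwh   # tonnes CO2e per MWh
g_per_kwh = t_per_mwh * 1000                  # grams CO2e per kWh

print(f"{t_per_mwh:.2f} t CO2e per MWh (~{g_per_kwh:.0f} g per kWh)")
```

The implied intensity, roughly 0.43 tonnes per MWh, is consistent with electricity drawn largely from fossil-fuel generation, which is the point the paragraph above makes about the current power mix.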
Compare AI's massive hunger for energy with that of the human brain. While it is hard to calculate the exact energy the human brain needs for its various functions, including information processing and analysis, it is clearly only a tiny fraction of that of AI. In 1989, Ralph Merkle published his study "Energy Limits to the Computational Power of the Human Brain" (Merkle, 1989). He calculated that the human brain uses only about 10 W of power, while estimating its "computational power" at ~10^13 to 10^16 operations per second. Regardless of the exact energy "consumption" of the human brain per operation, which is rather challenging to determine even with magnetic resonance spectroscopy (MRS) and functional magnetic resonance spectroscopy (fMRS) (Rothman et al., 2019; Hyder and Rothman, 2012; Rothman et al., 2011), the notion that the human brain uses a tiny fraction of the energy of AI is hard to contest (Hughes, 2023).
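Merkle's figures, combined with the GPT-3 training energy quoted earlier, allow a rough order-of-magnitude comparison. The sketch below uses only numbers already cited in the text; the results are crude estimates, not measurements.

```python
# Order-of-magnitude comparison from the figures cited in the text:
# Merkle's ~10 W brain power and ~1e13-1e16 operations per second,
# versus the 1,287 MWh quoted for training GPT-3.
brain_power_w = 10.0
ops_per_s_low, ops_per_s_high = 1e13, 1e16

# implied energy per "operation" in the brain
j_per_op_low = brain_power_w / ops_per_s_high   # ~1e-15 J
j_per_op_high = brain_power_w / ops_per_s_low   # ~1e-12 J

# how long could a 10 W brain run on GPT-3's training energy?
gpt3_training_j = 1287e6 * 3600        # MWh -> Wh -> J (1 Wh = 3600 J)
seconds = gpt3_training_j / brain_power_w
years = seconds / (3600 * 24 * 365)

print(f"energy per operation: {j_per_op_low:.0e} to {j_per_op_high:.0e} J")
print(f"GPT-3's training energy would power a brain for ~{years:,.0f} years")
```

On these numbers, a single operation in the brain costs on the order of femtojoules to picojoules, and the electricity used to train GPT-3 once could power a 10 W brain for well over ten thousand years.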
A potential clue to such highly energy-efficient "operation" may be that "wiring", i.e., neuronal connectivity, is a critical but not the only aspect of how the human brain operates (Gebicke-Haerter, 2023). In contrast to the binary modus operandi of computers, and thus of AI, the human brain is an incredibly complex "machine" using BOTH analogue and digital modes simultaneously (Marcoli et al., 2022; Guidolin et al., 2022). The seamless integration and utilization of digital and analogue "modes" are likely the "secret" behind the unparalleled capacity and abilities of the human brain. Its "operation" is not restricted to binary signaling but is a highly sophisticated and complex combination of electrical and chemical signaling within its networks. The dozens of neurotransmitters and neuromodulators, along with their receptors, ion channels, and intracellular "effectors", are what make the human brain such an incredibly energy-efficient "computer". In addition, neurons can use more than one neurotransmitter (Svensson et al., 2018), integrating various signaling modalities (e.g., Agoston et al., 1988; Agoston et al., 1994). Knowing which neurotransmitters and neuromodulators the various human brain regions and neuronal pathways utilize - the chemical neuroanatomy - is fundamental to understanding how the human brain operates in health and what chemical changes underlie neuropsychiatric disorders (Hokfelt et al., 1984).
This is the field to which Miklos Palkovits has made enormous contributions. Miklos, along with Tomas Hokfelt, another giant of the field of chemical neuroanatomy (Hokfelt, 2010), and other giants of neuroscience - Kjell Fuxe (e.g., Agnati et al., 2011), Harry Steinbusch (Steinbusch, 1981), Pasko Rakic (Rakic, 1988), Paul Greengard (Greengard, 2001), Paul E. Sawchenko (Sawchenko, 1998), Larry Swanson (Swanson, 2018), and Clifford B. Saper (Saper and Fuller, 2017), to name a few - has majorly contributed to the "chemical mapping" of the human brain, thus helping us understand its majesty - and mysteries. (The Handbook of Chemical Neuroanatomy, edited by Bjorklund and Hokfelt and first published in 1983, has reached 22 volumes (Bjorklund and Hokfelt).)
Celebrating Miklos' 80th birthday 10 years ago, I wrote a short article entitled "Great insight created by tiny holes; celebrating 40 years of brain micropunch technique" (Agoston, 2014) that summarized his immense contribution to neuroanatomy up to December 2013. By then, Miklos had published more than 1,000 research papers - many of them citation classics - as well as 59 book chapters and 8 books, and had been nominated twice for the Nobel Prize. Ten years later, in December 2023, I had the honor of attending Miklos' 90th birthday celebration and of learning about his current and - yes - future projects. During the last 10 years, Miklos has published 57 peer-reviewed papers and numerous book chapters and reviews, and he has written and is constantly updating his book Practical Neurology and Neuroanatomy (co-written with Dr. S. Komoly) with the newest neuroimaging and neurophysiology findings.
Miklos' current research, conducted with collaborators across the globe, includes characterization of the human brain (g)lymphatic system (Mezey et al., 2021; Mezey and Palkovits, 2015), identification of SARS-CoV-2 entry sites into the human brain (Vitale-Cross et al., 2022), the role of neuropeptides and their signaling in neuropsychiatric disorders (Barde et al., 2024; Samardžija et al., 2023; Zhong et al., 2022; Hökfelt et al., 2018; Vas et al., 2023; Barde et al., 2016), and neurogenetics (!) (Hardwick et al., 2022; Dóra et al., 2022; Roy et al., 2017).
The last decade of neuroscience research, utilizing powerful imaging, electrophysiology, and other techniques, has greatly expanded our knowledge; however, we are still far from completely understanding how the human brain works. What are the neurobiological, neuroanatomical, and chemical substrates of consciousness, inspiration, and intuition? What we do know is that Miklos' work has been paving the way toward a better understanding of this marvel, the human brain.
Miklós, thank you for teaching and inspiring so many of us, happy (belated) 90th (!) birthday and I am so much looking forward to learning much more from you!

Keywords: machine learning (ML), artificial neuronal network (ANN), neurotransmitters, energy consumption, efficiency

Received: 22 Jan 2024; Accepted: 25 Mar 2024.

Copyright: © 2024 Agoston. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Prof. Denes V. Agoston, Uniformed Services University of the Health Sciences, Anatomy, Physiology and Genetics, Bethesda, 20814, MD, United States