Understanding the human brain remains one of science's greatest challenges. While neuroscience has advanced our knowledge, the computational principles underlying biological neural networks remain elusive. Artificial neural networks (ANNs), simplified models inspired by biological neurons, offer a promising avenue of exploration. First developed decades ago, these networks use simplified neuronal representations to perform complex tasks, and deep learning architectures such as CNN-LSTM hybrids have further improved accuracy and speed. Recent advances in efficiency have also reduced the reliance on supercomputers for simulating even small brain regions.
Despite this progress, real neurons are far more complex than their artificial counterparts, processing thousands of inputs through intricate dendritic networks. The brain functions as a dynamical system poised near chaos, balancing local activity with global coordination, which is crucial for adaptive behavior. By integrating insights from neuroscience with ANNs, researchers are uncovering new ways to model these dynamics and better understand cognition and behavior.
The brain's adaptability stems from metastability, in which coupled oscillatory processes allow coordination without settling into fixed states. This balance between integration and segregation enhances informational complexity, enabling internal adaptation. Coordination Dynamics (CD), for example, frames coordination as the synchronization of coupled oscillators, akin to Huygens's pendulum clocks falling into step through "sympathy." However, current models struggle to capture such intricate non-linear dynamics. Brain networks exhibit fractal-like hierarchical structures, in which densely connected modules are linked by sparse inter-modular connections. Synthetic hierarchical and modular networks (HMNs) aim to bridge the gap between structure and dynamics, though much remains to be explored.
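The kind of oscillatory synchronization that CD points to can be illustrated with a minimal Kuramoto model, a standard abstraction of coupled phase oscillators. The parameter values below (oscillator count, coupling strength, frequency spread) are illustrative choices, not drawn from any particular study:

```python
import numpy as np

def kuramoto(n=50, coupling=1.5, dt=0.01, steps=2000, seed=0):
    """Euler-integrate the Kuramoto model of n coupled phase oscillators.

    Returns the order parameter r in [0, 1] at each step: r near 0 means
    incoherent phases, r near 1 means the population has synchronized.
    """
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 0.5, n)           # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, n)      # random initial phases
    r_history = np.empty(steps)
    for t in range(steps):
        # mean-field coupling: each oscillator is pulled toward the others
        phase_diff = theta[None, :] - theta[:, None]
        dtheta = omega + (coupling / n) * np.sin(phase_diff).sum(axis=1)
        theta += dt * dtheta
        r_history[t] = np.abs(np.exp(1j * theta).mean())
    return r_history

r = kuramoto()
print(f"initial r = {r[0]:.2f}, final r = {r[-1]:.2f}")
```

With coupling above the critical value, the order parameter rises from near-incoherence toward partial synchrony; weakening the coupling or widening the frequency distribution keeps the population in the intermediate, metastable-like regimes the text describes.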
This Research Topic fosters collaboration between computational scientists and neuroscientists to develop models that better capture the complexity of biological neural systems. While progress has been made, new models are needed to reconcile simplified ANNs with the complex properties of real neurons. Addressing this challenge will deepen our understanding of both natural and artificial neural networks, leading to more accurate models and advances in both fields.
We particularly welcome papers on themes including, but not limited to:
- New modelling approaches to address challenges in neural complexity and dynamics
- Bridging neuroscience and artificial neural networks for better models of neural behavior
- Modular and hierarchical organization of brain networks and ANNs
- Fractal-like structures in brain networks and ANNs
- Merging spontaneous actions with directed behaviors (natural vs. artificial neural networks)
- Importance of self-organization and nonlinear dynamics
- Metastability from coupled oscillatory processes
- Phenomena such as random walks, metastable states, spontaneous neuronal avalanches, and infinite correlations in ANNs
- New machine learning advancements for faster simulations without supercomputers
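As a concrete illustration of the hierarchical and modular organization listed above, a synthetic HMN adjacency matrix can be sketched by nesting increasingly sparse inter-module links around dense modules. The sizes and connection probabilities below are arbitrary values chosen for illustration:

```python
import numpy as np

def hmn_adjacency(levels=3, module_size=4, p_intra=0.9, p_inter=0.1, seed=0):
    """Build a hierarchical modular network (HMN) adjacency matrix.

    At the lowest level, nodes within a module connect densely (p_intra).
    At each higher level, sibling blocks are joined by sparser random
    links (p_inter shrinks with level), giving the nested structure.
    """
    rng = np.random.default_rng(seed)
    n = module_size * 2 ** levels
    adj = np.zeros((n, n), dtype=int)
    # level 0: dense modules along the diagonal
    for start in range(0, n, module_size):
        block = rng.random((module_size, module_size)) < p_intra
        adj[start:start + module_size, start:start + module_size] = block
    # higher levels: sparse links between the two halves of each block
    for level in range(1, levels + 1):
        block_n = module_size * 2 ** level
        for start in range(0, n, block_n):
            half = block_n // 2
            a = slice(start, start + half)
            b = slice(start + half, start + block_n)
            links = rng.random((half, half)) < p_inter / level
            adj[a, b] |= links
    adj = adj | adj.T            # make the network undirected
    np.fill_diagonal(adj, 0)     # no self-loops
    return adj

A = hmn_adjacency()
print(A.shape, int(A.sum()))
```

The resulting matrix has dense diagonal blocks (modules) bridged by progressively sparser off-diagonal links, mirroring the dense-intra, sparse-inter connectivity described for brain networks.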
Article types and fees
This Research Topic accepts the following article types, unless otherwise specified in the Research Topic description:
Brief Research Report
Data Report
Editorial
FAIR² Data
FAIR² DATA Direct Submission
General Commentary
Hypothesis and Theory
Methods
Mini Review
Articles accepted for publication by our external editors following rigorous peer review incur a publishing fee, charged to authors, institutions, or funders.
Important note: All contributions to this Research Topic must be within the scope of the section and journal to which they are submitted, as defined in their mission statements. Frontiers reserves the right to guide an out-of-scope manuscript to a more suitable section or journal at any stage of peer review.