- 1. Department of Neurobiology, Graduate School of Medicine, Hokkaido University, Sapporo, Japan
- 2. Research Institute of Electrical Communication, Tohoku University, Sendai, Japan
- 3. Departament de Física de la Matèria Condensada, Universitat de Barcelona, Barcelona, Spain
- 4. Department of Electrical and Electronic Engineering, Graduate School of Engineering, Kobe University, Kobe, Japan
Editorial on the Research Topic
Bridging computation, biophysics, medicine, and engineering in neural circuits
Neural circuits in the brain are highly energy-efficient biological hardware that enables ultra-fast parallel computation integrating sensory inputs, motor outputs, and internal memory processes. The mechanisms underlying this energy-efficient computation, however, remain to be unraveled and call for a wide range of multidisciplinary approaches. This Research Topic aims to foster mutual understanding among in silico, in vitro, and in vivo neuroscientists with diverse backgrounds. To this end, it collects cutting-edge contributions from computational neuroscience, biophysics, neurophysiology, materials science, neuromorphic hardware, bioengineering, and medicine. A unifying theme across these works is the investigation of how structured dynamics and learning rules enable robust information processing, cognition, and recovery from damage in biological neural systems, and how these principles can guide the development of computational and neuroengineering models.
Several studies in this Research Topic focus on recurrent network dynamics as a foundation for cortical information processing. Yonemura and Katori extended reservoir computing-based cortical models to account for multi-sensory integration. By combining predictive coding with reliability-weighted error feedback, their model allows a recurrent reservoir to extract temporal context while dynamically adjusting the influence of each sensory modality depending on its noise level. In silico simulations demonstrate that recurrent dynamics can flexibly integrate information across modalities, supporting robust speech recognition. These findings highlight reservoir computing as a biologically plausible and powerful approach for modeling multi-sensory integration in the cortex, where noise and uncertainty are intrinsic features.
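To give a flavor of the reservoir principle that such models build on, the core state update of an echo-state-style recurrent network can be sketched in a few lines. This is a generic toy, not the authors' implementation; the reservoir size, spectral radius, leak rate, and the two-channel "bimodal" input are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100      # reservoir size (illustrative)
rho = 0.9    # target spectral radius; rho < 1 favors the echo-state property

# Random recurrent weights rescaled to spectral radius rho
W = rng.normal(size=(N, N))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.normal(size=(N, 2))  # input weights for two "modalities"

def step(x, u, leak=0.3):
    """One leaky-integrator reservoir update driven by input vector u."""
    return (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u)

# Drive the reservoir with a toy bimodal input stream
x = np.zeros(N)
for t in range(200):
    u = np.array([np.sin(0.1 * t), np.cos(0.05 * t)])
    x = step(x, u)

# A linear readout trained on states like x would perform the actual task
print(x.shape)
```

Only the linear readout is trained in this paradigm; the fixed random recurrence supplies the temporal context that the readout exploits.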
At the level of synaptic learning rules, Orima et al. investigated the spatiotemporal learning rule (STLR), which has been proposed to reproduce hippocampal synaptic plasticity. While prior studies mainly evaluated network outputs, Orima et al. emphasized the importance of analyzing synaptic weights themselves to understand the underlying learning dynamics. By mapping synaptic weights into a distance space and applying multiple analytical methods including multidimensional scaling, fractal dimension estimation, and iterated function system modeling, they showed that STLR organizes synaptic weights into a fractal-like structure. This fractal coding suggests that synaptic plasticity rules may generate highly structured, self-similar weight distributions, offering a novel perspective on how complex memory representations can emerge from local learning rules.
Oscillatory dynamics and their functional roles are another major focus of this Research Topic. Masoliver et al. examined hippocampal theta rhythms, in which interneurons are phase-locked while pyramidal neurons exhibit phase precession. Noting the coexistence of synchrony and asynchrony, the authors proposed that this phenomenon resembles chimera states. Using Kuramoto oscillator networks that support chimera dynamics, they modeled the experimentally observed frequency relationship between theta oscillations (~8 Hz) and phase-precessing pyramidal neurons (~9 Hz). Moreover, faster oscillators exhibited theta-sequence-like ordering. Importantly, spiking neural networks trained on these chimera dynamics produced subsets of neurons showing phase precession, suggesting that chimera-like dynamics may underlie key temporal coding mechanisms in the hippocampus.
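The Kuramoto framework underlying this study can be sketched minimally as follows. Note that this toy uses global mean-field coupling with two illustrative frequency groups (8 Hz and 9 Hz, echoing the theta and phase-precessing populations); it does not by itself produce a chimera state, which requires nonlocal coupling and tuned parameters as in the authors' networks:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64
# Two frequency groups, loosely mimicking the ~8 Hz theta and ~9 Hz
# phase-precessing populations (purely illustrative values)
omega = 2 * np.pi * np.where(np.arange(N) < N // 2, 8.0, 9.0)
K = 2.0       # global coupling strength (illustrative)
dt = 1e-3     # integration step, 1 ms
theta = rng.uniform(0, 2 * np.pi, N)

for _ in range(5000):  # 5 s of simulated time, forward Euler
    # Mean-field Kuramoto coupling: each phase is pulled toward the others
    coupling = (K / N) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    theta = (theta + dt * (omega + coupling)) % (2 * np.pi)

# Kuramoto order parameter r in [0, 1]; r = 1 means full synchrony
r = abs(np.exp(1j * theta).mean())
print(f"order parameter r = {r:.2f}")
```

The order parameter r quantifies the synchrony–asynchrony balance that, in chimera states, differs persistently between subpopulations.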
Gamma oscillations, which are closely associated with cognitive functions, were addressed by Li et al., who developed a biologically plausible low-rank spiking neural network based on the voltage-dependent theta neuron model. Through macroscopic analysis, they identified network regimes ranging from stationary firing to gamma oscillations driven by structured connectivity. The model successfully reproduced phase-dependent modulation of neuronal responses in a Go-Nogo task and demonstrated that gamma oscillations can enhance and prolong signal responses. This work provides a mechanistic explanation for how oscillatory dynamics influence cognitive processing and extends low-rank spiking neural network models to capture population-level synchrony.
Predictive representations and reinforcement learning were explored by Tsurumi and Morita through actor–critic neural network models incorporating the successor representation (SR), a predictive state representation thought to be relevant to hippocampal function. By systematically varying whether the actor, the critic, or both components used SR, and comparing these configurations with Q-learning and SARSA, the authors revealed distinct functional roles of SR in different components of the learning architecture. They further showed that combining SR with one-hot state encoding yielded complementary benefits, suggesting that multiple state representations may coexist and cooperate during learning. These findings support the idea that the striatum and related circuits may employ heterogeneous representations to optimize action selection.
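For readers unfamiliar with the successor representation, its standard temporal-difference update can be sketched on a toy deterministic ring world. The environment, learning rate, and discount factor here are illustrative assumptions, not the task or parameters of Tsurumi and Morita:

```python
import numpy as np

n_states = 5
alpha, gamma = 0.1, 0.9
M = np.zeros((n_states, n_states))  # M[s, s']: discounted future occupancy of s' from s

def sr_td_update(M, s, s_next):
    """One TD update of the successor matrix after observing transition s -> s_next."""
    onehot = np.eye(n_states)[s]
    M[s] += alpha * (onehot + gamma * M[s_next] - M[s])
    return M

# Learn the SR of a deterministic ring walk 0 -> 1 -> ... -> 4 -> 0
s = 0
for _ in range(5000):
    s_next = (s + 1) % n_states
    M = sr_td_update(M, s, s_next)
    s = s_next

# Once M is learned, values for any reward vector w follow as V = M @ w
w = np.zeros(n_states)
w[0] = 1.0  # reward only in state 0
V = M @ w
print(np.round(V, 2))
```

The separation of M (predictive state structure) from w (reward) is what makes the SR flexible: revaluing rewards requires no relearning of the transition structure.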
Network robustness and recovery after injury were addressed by Sumi et al., who modeled damage and recovery in a modular in vitro neuronal network. Their simulations reproduced experimental observations of reduced spontaneous collective activity following focal lesions and recovery within 24 h. By incorporating spike-timing-dependent plasticity (STDP), the model captured both the decline and restoration of global activity. The results indicate that modular organization, together with synaptic plasticity, enhances network resilience, supports functional recovery, and improves information representation during the post-injury phase.
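The pair-based STDP window at the heart of such models can be sketched as follows; the amplitudes and time constant are illustrative textbook-style values, not those fitted by Sumi et al.:

```python
import numpy as np

A_plus, A_minus = 0.01, 0.012  # potentiation/depression amplitudes (illustrative)
tau = 20.0                     # STDP time constant in ms (illustrative)

def stdp_dw(dt):
    """Pair-based STDP window: weight change for spike-time difference
    dt = t_post - t_pre (ms). Pre-before-post (dt > 0) potentiates,
    post-before-pre (dt <= 0) depresses."""
    return np.where(dt > 0,
                    A_plus * np.exp(-np.abs(dt) / tau),
                    -A_minus * np.exp(-np.abs(dt) / tau))

# Evaluate the window over a range of spike-time differences
dts = np.linspace(-100.0, 100.0, 201)
dw = stdp_dw(dts)
print(stdp_dw(10.0), stdp_dw(-10.0))  # positive (LTP), negative (LTD)
```

Applied across a lesioned network, rules of this form let surviving connections strengthen where correlated firing persists, which is the proposed substrate of the recovery dynamics.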
At the single-neuron and axonal level, Kamiya investigated afterdischarges, prolonged increases in excitability following strong repetitive stimulation, using a computational model of hippocampal mossy fibers with realistic ionic conductances. The study showed that slight depolarization of distal axons combined with high-frequency stimulation can induce robust, long-lasting afterdischarges originating from distal axonal sites and propagating antidromically to the soma. Crucially, replacing inactivating potassium channels with non-inactivating ones abolished afterdischarges, demonstrating that potassium channel inactivation is essential for ectopic spike generation.
Synaptic transmission and input–output characterization were examined by Gattas et al., who applied Volterra expansion to model hippocampal CA3–CA1 synaptic transmission. They found that apical dendritic field excitatory postsynaptic potentials could be described with over 94% accuracy by a second-order model, whereas basal dendrites required higher-order non-linearities. This concise yet accurate mathematical framework offers a scalable approach for building biologically realistic network models that capture dendrite-specific processing.
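A discrete second-order Volterra model of this general kind can be sketched generically. The kernels below are random placeholders rather than fitted CA3–CA1 kernels, and the memory length M is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
M = 5     # memory length in samples (illustrative)
k0 = 0.1  # zeroth-order (constant) term
k1 = rng.normal(size=M)              # first-order (linear) kernel
k2 = rng.normal(size=(M, M)) * 0.1   # second-order kernel: pairwise input interactions
k2 = (k2 + k2.T) / 2                 # symmetrize, as is conventional

def volterra2(x):
    """Discrete second-order Volterra model: each output sample depends on the
    last M inputs linearly (via k1) and through pairwise products (via k2)."""
    y = np.full(len(x), k0)
    for t in range(M - 1, len(x)):
        u = x[t - M + 1:t + 1][::-1]  # most recent sample first
        y[t] += k1 @ u + u @ k2 @ u
    return y

x = rng.normal(size=200)  # toy presynaptic input sequence
y = volterra2(x)
print(y.shape)
```

Truncating the expansion at second order is exactly the economy the study exploits for apical dendrites, while basal dendrites would require adding higher-order kernels.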
Neuroengineering approaches were highlighted by Miyahara et al., who developed a microfluidics-based co-culture system of dorsal root ganglion (DRG) and spinal neurons to study spinal pain processing. Using high-density microelectrode arrays and optogenetic stimulation, they demonstrated that activating DRG neurons induced strong and prolonged synchronization in spinal networks, persisting for at least 20 min. This platform provides a valuable tool for investigating mechanisms underlying chronic and nociplastic pain.
Criticality and state-dependent dynamics were explored by Yaghoubi et al. using large-scale calcium imaging of neuronal cultures. By analyzing neuronal avalanche statistics separately during up and down states, they found scale-free behavior in both states when intrinsic time scales were considered. Down-state dynamics closely resembled those of cultures lacking up states, and standard network models failed to reproduce these observations, suggesting the need for more complex, state-dependent self-organization mechanisms.
Additional studies examined autaptic plasticity, developmental synapse properties, brain-state analysis, motor learning, and in vitro human brain modeling. Onda et al. showed that STDP in a minimal autaptic neuron model induces oscillatory weight changes depending on delay and stimulation frequency, potentially contributing to synchronization. Kitaoka et al. revealed positional differences in presynaptically silent synapses within single neurons, with higher proportions at distal sites. Hosaka et al. critically evaluated energy landscape analysis of resting-state fMRI, showing that many inferred state-switching features can arise from stationary linear properties. Tezuka et al. introduced a novel behavioral paradigm for studying simultaneous bimanual movements in mice, and Nishimura et al. reviewed advances in human pluripotent stem cell-derived brain organoids as platforms for modeling human brain development and disease. Finally, Onishi et al. summarized recent developments in activity-dependent gene-expression tools, which enable causal analyses linking neuronal activity to behavior, and outlined their anticipated contributions to neural circuit research.
As overviewed above, the studies in this Research Topic underscore the importance of structured dynamics, plasticity, and representation in neural systems and demonstrate how integrative computational, experimental, and engineering approaches can deepen our understanding of brain function and dysfunction. It is also clear that continued progress in elucidating the dynamics and functions of neural circuits will require a sustained combination of cutting-edge theoretical, engineering, and biophysical approaches.
Author contributions
HK: Conceptualization, Funding acquisition, Writing – original draft. HY: Conceptualization, Funding acquisition, Writing – review & editing. JS: Conceptualization, Funding acquisition, Writing – review & editing. SS: Conceptualization, Funding acquisition, Writing – review & editing. TO: Conceptualization, Funding acquisition, Writing – review & editing.
Funding
The author(s) declared that financial support was received for this work and/or its publication. This work was supported by the KAKENHI from the JSPS (24H02336 and 25K09842 to HK, 24H02329 and 24H02332 to HY, 25H00447 to SS, 23K21702 and 23KK0184 to TO), and the Cooperative Research Project Program of the Research Institute of Electrical Communication at Tohoku University. JS received support from the European Union Horizon 2020 research and innovation program (Grant No. 964877, project NEU-CHiP), as well as from grant PID2022-137713NB-C22, funded by MCIU/AEI/10.13039/501100011033 and by ERDF/EU, and by the Generalitat de Catalunya under grant 2021-SGR-00450. TO received support from the CREST program (Grant No. JPMJCR24Q5) funded by the Japan Science and Technology Agency.
Conflict of interest
The author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
The authors HK, HY, and SS declared that they were editorial board members of Frontiers at the time of submission. This had no impact on the peer review process and the final decision.
Generative AI statement
The author(s) declared that generative AI was not used in the creation of this manuscript.
Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.
Publisher's note
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Keywords: biophysics, complex networks, computational neuroscience, information processing, neural circuits, neuromorphic hardware, neurophysiology, signal processing
Citation: Kamiya H, Yamamoto H, Soriano J, Sato S and Omori T (2026) Editorial: Bridging computation, biophysics, medicine, and engineering in neural circuits. Front. Neural Circuits 20:1776224. doi: 10.3389/fncir.2026.1776224
Received: 27 December 2025; Accepted: 02 January 2026;
Published: 21 January 2026.
Edited and reviewed by: Nicolangelo Iannella, University of Oslo, Norway
Copyright © 2026 Kamiya, Yamamoto, Soriano, Sato and Omori. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Haruyuki Kamiya, kamiya@med.hokudai.ac.jp; Hideaki Yamamoto, hideaki.yamamoto.e3@tohoku.ac.jp; Jordi Soriano, jordi.soriano@ub.edu; Shigeo Sato, shigeo@riec.tohoku.ac.jp; Toshiaki Omori, omori@eedept.kobe-u.ac.jp