EDITORIAL article

Front. Physiol., 11 May 2022

Sec. Fractal Physiology

Volume 13 - 2022 | https://doi.org/10.3389/fphys.2022.917001

Editorial: Inference, Causality and Control in Networks of Dynamical Systems: Data Science and Modeling Perspectives to Network Physiology With Implications for Artificial Intelligence

  • 1. Ming-Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, United States

  • 2. Keck Laboratory for Network Physiology, Department of Physics, Boston University, Boston, MA, United States

  • 3. Harvard Medical School and Division of Sleep Medicine, Brigham and Women’s Hospital, Boston, MA, United States

  • 4. Institute of Solid State Physics, Bulgarian Academy of Sciences, Sofia, Bulgaria

  • 5. Delft Center for Systems and Control, Delft University of Technology, Delft, Netherlands


A fundamental problem crisscrossing the fields of physiology and artificial intelligence is understanding how complex activity and behavior emerge from the intrinsic underlying structure and dynamics. To address this problem, we need new methodologies and tools for a comprehensive analysis of complex systems dynamics. Multifractal formalism and methodology enable us to investigate the local interactions underlying physiological systems and to quantify the organization of physiological temporal fluctuations and their cascades across scales (Ivanov et al., 1999; Ivanov et al., 2001; Ivanov et al., 2002; Mukli et al., 2015). In addition, we need a general network framework to examine networks of interactions among diverse subsystems across space and time scales that lead to emergent complex behaviors at the systems level (Bashan et al., 2012; Ivanov and Bartsch, 2014; Ivanov et al., 2016). Despite recent progress in the theory of dynamic networks, fundamental methodological and conceptual challenges remain in understanding how global states and functions emerge in networks of diverse dynamical systems with time-varying interactions, and what the basic principles of their hierarchical integration are. In particular, when mining the time-varying structure and dynamics of complex networks, one has to overcome various internal or external perturbations that can transiently or permanently mask the activity of particular nodes and their causal interactions (Xue and Bogdan, 2017a; Xue and Bogdan, 2017b; Gupta et al., 2018; Gupta et al., 2019; Xue and Bogdan, 2019).
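
To make the notion of quantifying fluctuations across scales concrete, the following is a minimal sketch of detrended fluctuation analysis (DFA), the monofractal building block that multifractal formalisms such as MF-DFA generalize by examining a whole spectrum of fluctuation moments. The function name, scale choices, and test signal are illustrative assumptions, not code from the cited works.

```python
import numpy as np

def dfa_exponent(signal, scales):
    """Estimate the DFA scaling exponent alpha of a 1-D signal."""
    profile = np.cumsum(signal - np.mean(signal))  # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        rms = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)           # local linear detrend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    # slope of log F(s) vs. log s gives the scaling exponent alpha
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rng = np.random.default_rng(0)
white = rng.standard_normal(8192)
alpha = dfa_exponent(white, [16, 32, 64, 128, 256])
# uncorrelated noise should yield alpha close to 0.5;
# long-range correlated physiological signals deviate from this value
```

A scale-invariant signal yields an approximately straight line in the log-log plot of fluctuation versus scale; multifractal extensions repeat this computation for many moment orders.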

Novel artificial intelligence techniques and machine learning algorithms may equip us with tools to classify and predict emergent behavior in dynamical networks based simultaneously on network topology and on temporal patterns in network dynamics. Key insights from multifractal and differential geometry concepts can help analyze and quantify the complexity of such networks. Furthermore, they allow us to determine the most efficient network architecture for generating a given function, to quantify key universalities, and to identify new theoretical directions for artificial intelligence and machine learning based on physiological principles (Richards et al., 2019). Ultimately, we may attain sustainable artificial systems whose features are seamlessly indistinguishable from those of physiological systems (Wu et al., 2021).

From genomic, proteomic, and metabolic networks to microbial communities, neural systems, and the human network physiology of organ systems, complex systems display multi-scale spatiotemporal patterns that are frequently classified as non-linear, non-Gaussian, scale-invariant, and multifractal (West and Zweifel, 1992; Stanley et al., 1999; Ivanov et al., 2009; Bassingthwaighte et al., 2013). While several efforts have demonstrated that electromyographic signals possess fractal properties (Sanders et al., 1996; Xue et al., 2016; Garcia-Retortillo et al., 2020; Rizzo et al., 2020), Martin del Campo Vera and Jonckheere report a complex bursting rate variability phenomenon in which the surface electromyographic (sEMG) bursts are synchronous with wavelet packets in the D8 sub-band of the Daubechies 3 (db3) wavelet decomposition of the raw signal. Their db3 wavelet decomposition analysis reconstructs the sEMG bursts with two high coefficients at level 8, indicating a high incidence of two consecutive neuronal discharges. In contrast to heart rate variability (Ivanov et al., 1998), the newly reported bursting rate variability phenomenon involves a time-localization of the burst, with a statistical waveform match between the “D8 doublet” and the burst in the raw sEMG signal. While this analysis focused on the small cohort of patients available, further comprehensive studies can elucidate the interdependencies between electromyographic signals and other brain and physiological processes, determine their mechanistic role, and explore the implications for medical applications.
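
The idea of isolating one detail sub-band of a multilevel wavelet decomposition can be illustrated with a simplified sketch. The study above uses a db3 wavelet at level 8; here a Haar wavelet and shallow levels keep the arithmetic transparent, and `detail_subband` is a hypothetical helper, not the authors' code.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar DWT: approximation and detail."""
    x = x[: len(x) // 2 * 2]
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    """Inverse of one Haar DWT level."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def detail_subband(x, level):
    """Reconstruct only the component of x carried by the detail sub-band
    at the given level (the analogue of D8 in a level-8 db3 decomposition)."""
    details, approx = [], np.asarray(x, dtype=float)
    for _ in range(level):
        approx, d = haar_dwt(approx)
        details.append(d)
    # zero every coefficient except the deepest detail band, then invert
    rec = np.zeros_like(approx)
    for i, d in enumerate(reversed(details)):
        keep = d if i == 0 else np.zeros_like(d)
        rec = haar_idwt(rec, keep)
    return rec

burst_component = detail_subband(np.sin(0.3 * np.arange(64)), 3)
```

Because the transform is orthonormal, the sub-band component is a projection of the signal: its energy never exceeds that of the raw signal, and a band dominated by two large coefficients reconstructs to a time-localized "doublet" waveform.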

Quests for understanding the inner workings of complex biological dynamics have not only provided more appropriate and efficacious medical therapies but have also led to new artificial intelligence algorithms and architectures. For instance, inspired by early models of how biological neurons process information, the reservoir computer, a type of recurrent neural network in which only the output weights are fit to a training signal, has provided promising high-performance, low-power computational strategies for classification tasks. Along these lines, Carroll considers the computational difficulty of parameter optimization in a reservoir computer and demonstrates that optimum classification performance occurs for the hyperparameters that maximize the entropy of the reservoir, for spiking and non-spiking reservoir computers alike. Intriguingly, Carroll shows that optimizing for entropy requires only a single realization of each signal to be classified, which provides a fast and low-power computational strategy.
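
As a toy illustration of entropy-guided hyperparameter selection, the sketch below drives a small echo-state reservoir and estimates the Shannon entropy of its state distribution for candidate spectral radii. This is inspired by, but not taken from, Carroll's paper: the network size, input signal, and histogram entropy estimator are all assumptions for illustration.

```python
import numpy as np

def run_reservoir(u, n=50, spectral_radius=0.9, seed=1):
    """Drive a random tanh reservoir with input u; return the state history."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n, n))
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # scale recurrence
    w_in = rng.standard_normal(n)
    x = np.zeros(n)
    states = []
    for ut in u:
        x = np.tanh(W @ x + w_in * ut)
        states.append(x.copy())
    return np.array(states)

def state_entropy(states, bins=16):
    """Histogram estimate of the Shannon entropy (bits) of the state values."""
    hist, _ = np.histogram(states.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

u = np.sin(0.2 * np.arange(500))
# compare entropy across candidate hyperparameters; Carroll's result suggests
# choosing the setting that maximizes the reservoir's entropy
entropies = {rho: state_entropy(run_reservoir(u, spectral_radius=rho))
             for rho in (0.1, 0.9)}
```

Note that this selection needs only one realization of the driving signal per candidate hyperparameter, which is the source of the method's low computational cost; the classification readout itself would be a separate linear fit of the states to target labels.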

Intelligence is an essential characteristic of healthy biological systems, allowing them not only to locally optimize in search of better fitness states but also to cope with unknown, rare environmental perturbations. Much of complex dynamical systems theory has focused on defining, quantifying, and analyzing the degree of emergence (Balaban et al., 2018; Koorehdavoudi and Bogdan, 2016), self-organization (Balaban et al., 2018; Koorehdavoudi and Bogdan, 2016; Polani, 2008), self-optimization (Gershenson et al., 2021; Koorehdavoudi and Bogdan, 2016; Prokopenko et al., 2009; Prokopenko et al., 2014), and complexity (Adami, 2002; Jost, 2004; Koorehdavoudi and Bogdan, 2016) of various complex biological systems, working towards a definition of “intelligence” (Hernández-Orallo et al., 2021). Yet generalization, the ability of a system to handle unexpected (future) situations for which it was not trained with a degree of success similar to that achieved on the (often small) data on which it was trained, remains an essential feature distinguishing human from artificial intelligence. Current efforts in artificial intelligence and machine learning investigate the degree to which a variety of neural network architectures are capable of “generalizing” and exhibiting intelligent behavior. Along these lines, Stoop provides a series of fundamental examples demonstrating that for situations not included in the training effort, AI systems tend to run into substantial problems. These examples not only highlight the difference between human and artificial intelligence but also call for renewed interest in defining the theoretical foundations of intelligence.
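
This failure mode can be seen in miniature with a classic textbook example (ours, not Stoop's): a flexible model fitted to a saturating response on a narrow training interval tracks the data well in-range, yet fails badly for inputs only slightly outside what it was trained on.

```python
import numpy as np

# Train: fit a degree-9 polynomial to a saturating response on [-1, 1].
x_train = np.linspace(-1.0, 1.0, 50)
y_train = np.tanh(3.0 * x_train)
coeffs = np.polyfit(x_train, y_train, deg=9)

# In-range behavior: the fit tracks the training data closely.
in_range_err = np.max(np.abs(np.polyval(coeffs, x_train) - y_train))

# Out-of-range behavior: a situation "not included in the training effort".
# The polynomial's high-order terms blow up, while tanh stays bounded.
x_far = 4.0
out_of_range_err = abs(np.polyval(coeffs, x_far) - np.tanh(3.0 * x_far))
```

The point is not this particular model class but the general asymmetry: interpolation quality on the training distribution says little about behavior under distribution shift, which is exactly where biological intelligence still outperforms trained systems.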

Taking inspiration from biological neural systems capable of solving complex multi-objective problems characterized by ill-conditioned Hessians, Chatterjee et al. pioneer a fractional time-series analysis framework that not only models neurophysiological processes but also circumvents the challenges of current optimization tools. More precisely, they show that the long-range memory observed in many biological systems, and in neurophysiological signals in particular, manifests as a non-exponential, power-law decay of trajectories, which can be used to model the behavior associated with the objective function’s local curvature at a given time point. This allows them to propose the NEuro-inspired Optimization (NEO) method for ill-conditioned Hessian problems. This promising effort shows that mathematical approaches aimed at understanding the multifractality of biological systems can provide new theoretical directions for artificial intelligence.
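
One standard way to see the power-law (long) memory that underlies such fractional-order models is through the Grünwald-Letnikov weights, a common discretization of the fractional derivative. The sketch below is illustrative only and is not the NEO algorithm itself.

```python
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov weights w_k = (-1)^k * C(alpha, k) for a fractional
    difference of order alpha; |w_k| decays as a power law ~ k**(-alpha-1),
    so every past sample retains non-negligible influence (long memory)."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

alpha = 0.6
w = np.abs(gl_weights(alpha, 1000))
# ratio test: power-law decay means w[2k]/w[k] tends to 2**(-alpha-1) ~ 0.33,
# whereas exponential (short) memory would drive this ratio toward zero
ratio = w[800] / w[400]
```

In a fractional-order state update, the next state is a weighted sum over the entire trajectory with these slowly decaying weights, in contrast to integer-order (Markovian) updates that depend only on the most recent state.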

The works presented in this Research Topic collection, together with current advances in fractal and multifractal investigations of the structure and dynamics of physiological systems and their applications to artificial intelligence, outline new challenges and opportunities in multidisciplinary research and applications. Dealing with the heterogeneity, multi-modality, and complexity of physiological and artificial systems requires rigorous mathematical and algorithmic techniques to extract causal interdependencies between systems across different scales while overcoming various sources of noise. Progress in this direction will therefore require new algorithmic strategies to quantify the time-varying information flow among diverse physiological and artificial processes across scales and to determine how it influences the system dynamics.

Furthermore, there is an urgent need to adopt a cross-scale perspective, and a corresponding theoretical framework, to investigate the multi-scale regulatory mechanisms underlying the overall network and their relation to emergent states and functions in physiological and artificial systems. This calls for interaction among statistical physics, non-linear dynamics, information theory, probability and stochastic processes, artificial intelligence, machine learning, control theory and optimization, basic physiology, and medicine, so that new theoretical and algorithmic foundations emerge for analyzing and designing physiological and artificial systems. Only then will the biomedical and engineering communities be able to develop new control methodologies that do not merely enforce a specific reference value but rather ensure that complexity and multifractality are restored to a desirable profile.

Statements

Author contributions

PB and SP wrote the editorial with mentorship and feedback from PI.

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

  • 1

    Adami, C. (2002). What Is Complexity? BioEssays 24 (12), 1085–1094. doi: 10.1002/bies.10192

  • 2

    Balaban, V., Lim, S., Gupta, G., Boedicker, J., and Bogdan, P. (2018). Quantifying Emergence and Self-Organisation of Enterobacter cloacae Microbial Communities. Sci. Rep. 8 (1). doi: 10.1038/s41598-018-30654-9

  • 3

    Bashan, A., Bartsch, R. P., Kantelhardt, J. W., Havlin, S., and Ivanov, P. C. (2012). Network Physiology Reveals Relations between Network Topology and Physiological Function. Nat. Commun. 3. doi: 10.1038/ncomms1705

  • 4

    Bassingthwaighte, J., Liebovitch, L., and West, B. (2013). Fractal Physiology (Springer).

  • 5

    Garcia-Retortillo, S., Rizzo, R., Wang, J. W. J. L., Sitges, C., and Ivanov, P. C. (2020). Universal Spectral Profile and Dynamic Evolution of Muscle Activation: A Hallmark of Muscle Type and Physiological State. J. Appl. Physiol. 129 (3), 419–441. doi: 10.1152/japplphysiol.00385.2020

  • 6

    Gershenson, C., Polani, D., and Martius, G. (2021). Editorial: Complexity and Self-Organization. Front. Robot. AI 8, 668305. doi: 10.3389/frobt.2021.668305

  • 7

    Gupta, G., Pequito, S., and Bogdan, P. (2018). Dealing with Unknown Unknowns: Identification and Selection of Minimal Sensing for Fractional Dynamics with Unknown Inputs. Proc. Am. Control Conf., 2814–2820. doi: 10.23919/ACC.2018.8430866

  • 8

    Gupta, G., Pequito, S., and Bogdan, P. (2019). Learning Latent Fractional Dynamics with Unknown Unknowns. Proc. Am. Control Conf., 217–222. doi: 10.23919/acc.2019.8815074

  • 9

    Hernández-Orallo, J., Loe, B. S., Cheke, L., Martínez-Plumed, F., and Ó hÉigeartaigh, S. (2021). General Intelligence Disentangled via a Generality Metric for Natural and Artificial Intelligence. Sci. Rep. 11 (1). doi: 10.1038/s41598-021-01997-7

  • 10

    Ivanov, P. C., Amaral, L. A. N., Goldberger, A. L., Havlin, S., Rosenblum, M. G., Struzik, Z. R., et al. (1999). Multifractality in Human Heartbeat Dynamics. Nature 399 (6735), 461–465. doi: 10.1038/20924

  • 11

    Ivanov, P. C., and Bartsch, R. P. (2016). Focus on the Emerging New Fields of Network Physiology and Network Medicine. New J. Phys. 18 (10), 100201. doi: 10.1088/1367-2630/18/10/100201

  • 12

    Ivanov, P. C., and Bartsch, R. P. (2014). Network Physiology: Mapping Interactions between Networks of Physiologic Networks. Underst. Complex Syst., 203–222. doi: 10.1007/978-3-319-03518-5_10

  • 13

    Ivanov, P. C., Goldberger, A. L., and Stanley, H. E. (2002). “Fractal and Multifractal Approaches in Physiology,” in The Science of Disasters (Springer Berlin Heidelberg), 218–257. doi: 10.1007/978-3-642-56257-0_7

  • 14

    Ivanov, P. Ch., Rosenblum, M. G., Peng, C. K., Mietus, J. E., Havlin, S., Stanley, H. E., et al. (1998). Scaling and Universality in Heart Rate Variability Distributions. Phys. A 249 (1–4), 587–593. doi: 10.1016/S0378-4371(97)00522-0

  • 15

    Ivanov, P. C., Ma, Q. D. Y., Bartsch, R. P., Hausdorff, J. M., Nunes Amaral, L. A., Schulte-Frohlinde, V., et al. (2009). Levels of Complexity in Scale-Invariant Neural Signals. Phys. Rev. E 79 (4). doi: 10.1103/PhysRevE.79.041920

  • 16

    Ivanov, P. C., Nunes Amaral, L. A., Goldberger, A. L., Havlin, S., Rosenblum, M. G., Stanley, H. E., et al. (2001). From 1/f Noise to Multifractal Cascades in Heartbeat Dynamics. Chaos 11 (3), 641–652. doi: 10.1063/1.1395631

  • 17

    Jost, J. (2004). External and Internal Complexity of Complex Adaptive Systems. Theory Biosci. 123 (1), 69–88. doi: 10.1016/j.thbio.2003.10.001

  • 18

    Koorehdavoudi, H., and Bogdan, P. (2016). A Statistical Physics Characterization of the Complex Systems Dynamics: Quantifying Complexity from Spatio-Temporal Interactions. Sci. Rep. 6. doi: 10.1038/srep27602

  • 19

    Mukli, P., Nagy, Z., and Eke, A. (2015). Multifractal Formalism by Enforcing the Universal Behavior of Scaling Functions. Phys. A Stat. Mech. Its Appl. 417, 150–167. doi: 10.1016/j.physa.2014.09.002

  • 20

    Polani, D. (2008). “Foundations and Formalizations of Self-Organization,” in Advanced Information and Knowledge Processing (Springer), 19–37. doi: 10.1007/978-1-84628-982-8_2

  • 21

    Prokopenko, M., Boschetti, F., and Ryan, A. J. (2009). An Information-Theoretic Primer on Complexity, Self-Organization, and Emergence. Complexity 15 (1), 11–28. doi: 10.1002/cplx.20249

  • 22

    Prokopenko, M., Polani, D., and Ay, N. (2014). On the Cross-Disciplinary Nature of Guided Self-Organisation. 3–15. doi: 10.1007/978-3-642-53734-9_1

  • 23

    Richards, B. A., Lillicrap, T. P., Beaudoin, P., Bengio, Y., Bogacz, R., Christensen, A., et al. (2019). A Deep Learning Framework for Neuroscience. Nat. Neurosci. 22 (11), 1761–1770. doi: 10.1038/s41593-019-0520-2

  • 24

    Rizzo, R., Zhang, X., Wang, J. W. J. L., Lombardi, F., and Ivanov, P. C. (2020). Network Physiology of Cortico-Muscular Interactions. Front. Physiol. 11. doi: 10.3389/fphys.2020.558070

  • 25

    Sanders, D. B., Stålberg, E. V., and Nandedkar, S. D. (1996). Analysis of the Electromyographic Interference Pattern. J. Clin. Neurophysiol. 13 (5), 385–400. doi: 10.1097/00004691-199609000-00003

  • 26

    Stanley, H. E., Amaral, L. A., Goldberger, A. L., Havlin, S., Ivanov, P. Ch., and Peng, C. K. (1999). Statistical Physics and Physiology: Monofractal and Multifractal Approaches. Phys. A 270 (1), 309–324. doi: 10.1016/S0378-4371(99)00230-7

  • 27

    West, B. J., and Zweifel, P. F. (1992). Fractal Physiology and Chaos in Medicine. Phys. Today 45 (3), 68–70. doi: 10.1063/1.2809583

  • 28

    Wu, C.-J., Raghavendra, R., Gupta, U., Acun, B., Ardalani, N., Maeng, K., et al. (2021). Sustainable AI: Environmental Implications, Challenges and Opportunities. Available at: https://arxiv.org/abs/2111.00364.

  • 29

    Xue, Y., and Bogdan, P. (2017b). “Constructing Compact Causal Mathematical Models for Complex Dynamics,” in Proceedings of the 2017 ACM/IEEE 8th International Conference on Cyber-Physical Systems, ICCPS 2017 (Part of CPS Week), 97–107. doi: 10.1145/3055004.3055017

  • 30

    Xue, Y., and Bogdan, P. (2019). Reconstructing Missing Complex Networks against Adversarial Interventions. Nat. Commun. 10 (1). doi: 10.1038/s41467-019-09774-x

  • 31

    Xue, Y., and Bogdan, P. (2017a). Reliable Multi-Fractal Characterization of Weighted Complex Networks: Algorithms and Implications. Sci. Rep. 7 (1). doi: 10.1038/s41598-017-07209-5

  • 32

    Xue, Y., Rodriguez, S., and Bogdan, P. (2016). “A Spatio-Temporal Fractal Model for a CPS Approach to Brain-Machine-Body Interfaces,” in Proceedings of the 2016 Design, Automation and Test in Europe Conference and Exhibition (DATE), 642–647. doi: 10.3850/9783981537079_0502

Keywords

fractional-order dynamic models, artificial intelligence (AI), physiology, multifractal networks, neuro-inspired artificial intelligence

Citation

Bogdan P, Ivanov PC and Pequito S (2022) Editorial: Inference, Causality and Control in Networks of Dynamical Systems: Data Science and Modeling Perspectives to Network Physiology With Implications for Artificial Intelligence. Front. Physiol. 13:917001. doi: 10.3389/fphys.2022.917001

Received

10 April 2022

Accepted

21 April 2022

Published

11 May 2022

Volume

13 - 2022

Edited and reviewed by

Francoise Argoul, Centre National de la Recherche Scientifique (CNRS), France

Copyright

*Correspondence: Sergio Pequito,

This article was submitted to Fractal Physiology, a section of the journal Frontiers in Physiology
