<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="brief-report">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Appl. Math. Stat.</journal-id>
<journal-title>Frontiers in Applied Mathematics and Statistics</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Appl. Math. Stat.</abbrev-journal-title>
<issn pub-type="epub">2297-4687</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fams.2022.879866</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Applied Mathematics and Statistics</subject>
<subj-group>
<subject>Perspective</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Additive Noise-Induced System Evolution (ANISE)</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Hutt</surname> <given-names>Axel</given-names></name>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/5234/overview"/>
</contrib>
</contrib-group>
<aff><institution>Universit&#x000E9; de Strasbourg</institution>, <addr-line>CNRS</addr-line>, <addr-line>Inria</addr-line>, <addr-line>ICube</addr-line>, <addr-line>MLMS</addr-line>, <addr-line>MIMESIS</addr-line>, <addr-line>Strasbourg</addr-line>, <country>France</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Ulrich Parlitz, Max Planck Society, Germany</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Hil G. E. Meijer, University of Twente, Netherlands; Cristina Masoller, Universitat Politecnica de Catalunya, Spain</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Axel Hutt <email>axel.hutt&#x00040;inria.fr</email></corresp>
<fn fn-type="other" id="fn001"><p>This article was submitted to Frontiers in Applied Mathematics and Statistics, a section of the journal Frontiers in Applied Mathematics and Statistics</p></fn></author-notes>
<pub-date pub-type="epub">
<day>08</day>
<month>04</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2022</year>
</pub-date>
<volume>8</volume>
<elocation-id>879866</elocation-id>
<history>
<date date-type="received">
<day>20</day>
<month>02</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>18</day>
<month>03</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2022 Hutt.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Hutt</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license></permissions>
<abstract>
<p>Additive noise has long been known not to change a system's stability. The discovery of stochastic and coherence resonance in nature, and their analytical description, has begun to change this view in recent decades. Detailed studies of stochastic bifurcations have likewise contributed to revising the original view of the role of additive noise. The present work attempts to put these pieces of work into a broader context by proposing the research direction ANISE as a perspective for the field. ANISE may embrace all studies that demonstrate how additive noise tunes a system's evolution beyond merely scaling its magnitude. The article provides two perspective directions of research. The first is the generalization of previous studies on the stationary state stability of a stochastic random network model subjected to additive noise; here the noise induces novel stationary states. The second is the application of subgrid-scale modeling to stochastic random network models. It is illustrated how numerical parameter estimation complements and extends subgrid-scale modeling and renders it more powerful.</p></abstract>
<kwd-group>
<kwd>random network</kwd>
<kwd>subgrid-scale modeling</kwd>
<kwd>mean-field analysis</kwd>
<kwd>particle swarm optimization</kwd>
<kwd>Erd&#x00151;s&#x02013;R&#x000E9;nyi</kwd>
</kwd-group>
<counts>
<fig-count count="2"/>
<table-count count="0"/>
<equation-count count="15"/>
<ref-count count="71"/>
<page-count count="7"/>
<word-count count="5401"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>Introduction</title>
<p>Noise is a major ingredient in most living and artificial thermodynamically open systems. Essentially, it is defined in contrast to a <italic>signal</italic>, which is assumed to be understood or at least known in some detail. Hence, the notion of noise is used whenever there is a lack of knowledge about a process, i.e., when it is necessary to describe something unknown or uncontrollable. The relation to chaos and fractals [<xref ref-type="bibr" rid="B1">1</xref>] is interesting: both appear as very complex features of systems whose dynamics are not known. Consequently, noise effects are included in models whenever irregular unknown processes must be described. Such processes may be highly irregular and deterministic, or random. The following paragraphs do not distinguish these two cases, but the mathematical models assume noise to be random.</p>
<p>Since noise represents an unknown process, it is typically regarded as a disturbing element that should be removed or compensated. Observed data are supposed to represent a superposition of signals, which carry important information about the system under study, and noise, whose origin is unrelated to the signal source. Moreover, noise may disturb the control of systems, e.g., in aviation engineering; it may cause difficulties in communication systems; and acoustic noise may even represent a serious health hazard in industrial work.</p>
<p>However, noise may also be beneficial to a system's dynamics and thus represent an indispensable ingredient. In engineering, for instance, cochlear implants can improve their signal transmission rate by adding noise and thus save electrical power [<xref ref-type="bibr" rid="B2">2</xref>]. Biomedical wearables can improve their sensitivity through additive noise [<xref ref-type="bibr" rid="B3">3</xref>]. In these applications, noise improves signal transmission via stochastic resonance (SR) [<xref ref-type="bibr" rid="B4">4</xref>]. It is well-known that natural systems employ SR to amplify weak signals and thus ensure information transmission [<xref ref-type="bibr" rid="B5">5</xref>]. As an aside, we mention chaotic resonance [<xref ref-type="bibr" rid="B6">6</xref>], which improves signal transmission through additional chaotic signals and hence demonstrates once more the similarity between noise and chaos. Mathematically, systems that exhibit SR are driven by a periodic force and additive noise and should exhibit a double-well potential.</p>
<p>Noise is strongly irregular, and intuition suggests that additive noise induces irregularity in the stimulated system. Conversely, noise may optimize a system's coherence and thus induce regular behavior. This effect is called coherence resonance (CR) [<xref ref-type="bibr" rid="B7">7</xref>, <xref ref-type="bibr" rid="B8">8</xref>] and has been demonstrated in a large number of excitable systems, such as chemical systems [<xref ref-type="bibr" rid="B9">9</xref>], neural systems [<xref ref-type="bibr" rid="B10">10</xref>], nanotubes [<xref ref-type="bibr" rid="B11">11</xref>], semiconductors [<xref ref-type="bibr" rid="B12">12</xref>], social networks [<xref ref-type="bibr" rid="B13">13</xref>], and the financial market [<xref ref-type="bibr" rid="B14">14</xref>]. Mathematically, CR describes a noise-induced transition between a quiescent non-oscillatory state and an oscillatory state.</p>
<p>Both SR and CR are prominent examples of mechanisms with a beneficial noise impact. These mechanisms are mathematically specific and require certain dynamical topologies and combinations of stimuli, e.g., the presence of two attractors between which the system is moved by additive noise. Other previous mathematical studies have focused on more generic additive noise-induced transitions, e.g., at bifurcation points. There is a rich literature on stochastic bifurcations in low-dimensional systems [<xref ref-type="bibr" rid="B15">15</xref>&#x02013;<xref ref-type="bibr" rid="B18">18</xref>], spatially extended systems [<xref ref-type="bibr" rid="B19">19</xref>&#x02013;<xref ref-type="bibr" rid="B23">23</xref>], and delayed systems [<xref ref-type="bibr" rid="B24">24</xref>&#x02013;<xref ref-type="bibr" rid="B27">27</xref>].</p>
<p>Beyond SR, CR, and stochastic bifurcations, additive noise does not only induce stability transitions between states: it may also tune the system within its stable regime and hence represents an important system parameter. Such an impact is omnipresent in natural systems, although it is less prominent and more difficult to observe. Nevertheless, this stochastic facilitation [<xref ref-type="bibr" rid="B28">28</xref>] manifests indirectly as fluctuating, probably random, observations. Examples of such observations are a large variability between repeated measurements [<xref ref-type="bibr" rid="B29">29</xref>&#x02013;<xref ref-type="bibr" rid="B31">31</xref>] and strong intrinsic fluctuations in observations [<xref ref-type="bibr" rid="B32">32</xref>&#x02013;<xref ref-type="bibr" rid="B34">34</xref>]. To describe such observed random properties, various fluctuation mechanisms have been proposed, such as deterministic chaotic dynamics [<xref ref-type="bibr" rid="B35">35</xref>], heterogeneity [<xref ref-type="bibr" rid="B29">29</xref>], or linear high-dimensional dynamics driven by additive noise [<xref ref-type="bibr" rid="B36">36</xref>].</p>
<p>To motivate the focus on additive noise in the present work, let us consider the linear stochastic model</p>
<disp-formula id="E1"><mml:math id="M1"><mml:mtable columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mo>-</mml:mo><mml:mi>&#x003B3;</mml:mi><mml:mi>x</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:mi>&#x003B1;</mml:mi><mml:mi>x</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mi>&#x003BE;</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:mi>&#x003B2;</mml:mi><mml:mi>&#x003B7;</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>with &#x003B3;, &#x003B1;, &#x003B2; &#x0003E; 0 and spectrally white, Gaussian-distributed noise &#x003BE;(<italic>t</italic>), &#x003B7;(<italic>t</italic>), both with zero mean and variance <italic>D</italic>. For multiplicative noise only (&#x003B1; &#x0003E; 0, &#x003B2; &#x0003D; 0), the ensemble average &#x02329;<italic>x</italic>&#x0232A; obeys [<xref ref-type="bibr" rid="B37">37</xref>, <xref ref-type="bibr" rid="B38">38</xref>]</p>
<disp-formula id="E2"><mml:math id="M2"><mml:mtable columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:mrow><mml:mo>&#x02329;</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo>&#x0232A;</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mo>-</mml:mo><mml:mi>&#x003B3;</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:msup><mml:mrow><mml:mi>&#x003B1;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mi>D</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:mfrac></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>&#x02329;</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo>&#x0232A;</mml:mo></mml:mrow><mml:mtext>&#x000A0;</mml:mtext><mml:mo>.</mml:mo></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>This shows that the system's origin is a stationary state whose stability depends on the multiplicative noise variance <italic>D</italic>. Hence, multiplicative noise affects the stability of the system. This is a well-known result [<xref ref-type="bibr" rid="B39">39</xref>, <xref ref-type="bibr" rid="B40">40</xref>]. However, multiplicative noise implies that the noise contribution depends on the system activity. This assumption is strong and cannot always be validated; especially when it is unknown how the noise couples to the system under study, it appears too strong. Hence, it is interesting to consider additive noise (&#x003B1; &#x0003D; 0, &#x003B2; &#x0003E; 0), whose contribution is independent of the system activity. Then</p>
<disp-formula id="E3"><mml:math id="M3"><mml:mtable columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:mrow><mml:mo>&#x02329;</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo>&#x0232A;</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mo>-</mml:mo><mml:mi>&#x003B3;</mml:mi><mml:mrow><mml:mo>&#x02329;</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo>&#x0232A;</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>demonstrating that zero-mean additive noise does not affect the stability of the system's stationary state at the origin. This is also well-established for linear systems. However, some previous stochastic bifurcation studies of the additive noise effect in nonlinear systems have revealed an induced change of stability of the system's stationary state, as mentioned above. Moreover, recent diverse studies of additive noise in oscillatory neural systems have revealed that additive noise may tune the system's principal oscillation frequency [<xref ref-type="bibr" rid="B41">41</xref>&#x02013;<xref ref-type="bibr" rid="B47">47</xref>]. The present work focuses on a certain class of dynamical differential equation models that exhibit Additive Noise-Induced System Evolution (ANISE) and in which the additive noise represents a determining element. Recently, Powanwe and Longtin [<xref ref-type="bibr" rid="B48">48</xref>] described experimentally observed neural burst activity as an additive noise-controlled process, and Powanwe and Longtin [<xref ref-type="bibr" rid="B49">49</xref>] provided conditions under which two additive noise-driven biological systems share their information optimally. Moreover, previous theoretical neural population studies have demonstrated that additive noise can explain intermittent frequency transitions observed in experimental resting-state electroencephalographic data [<xref ref-type="bibr" rid="B50">50</xref>], dynamical switches between two frequency bands induced by opening and closing the eyes in humans [<xref ref-type="bibr" rid="B51">51</xref>], and the enhancement of spectral power in the &#x003B3;-frequency band under the anesthetic ketamine [<xref ref-type="bibr" rid="B52">52</xref>].</p>
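The additive-noise case (&#x003B1; = 0, &#x003B2; &#x0003E; 0) can be verified numerically. The following sketch is illustrative and not part of the original article: it integrates the linear model with a simple Euler&#x02013;Maruyama scheme for two assumed noise variances and checks that the ensemble average decays as e<sup>&#x02212;&#x003B3;t</sup> independently of <italic>D</italic> (all parameter values are assumed).

```python
import numpy as np

# Euler-Maruyama sketch of the additive-noise case of the linear model,
#   dx/dt = -gamma*x + beta*eta(t),  <eta(t)eta(t')> = D*delta(t-t').
# The ensemble average <x(t)> should decay as x0*exp(-gamma*t) for any D.
# All parameter values are illustrative assumptions, not from the article.

rng = np.random.default_rng(0)

def ensemble_mean(gamma, beta, D, x0, T, dt, n_traj):
    """Estimate <x(T)> over n_traj independent realizations."""
    x = np.full(n_traj, x0, dtype=float)
    for _ in range(int(T / dt)):
        # beta*eta(t)*dt contributes beta*sqrt(D*dt)*N(0,1) per step
        x += -gamma * x * dt + beta * np.sqrt(D * dt) * rng.standard_normal(n_traj)
    return x.mean()

gamma, beta, x0, T, dt = 1.0, 1.0, 1.0, 1.0, 1e-3
m_small = ensemble_mean(gamma, beta, D=0.5, x0=x0, T=T, dt=dt, n_traj=20000)
m_large = ensemble_mean(gamma, beta, D=2.0, x0=x0, T=T, dt=dt, n_traj=20000)
print(m_small, m_large, x0 * np.exp(-gamma * T))  # all close to 0.368
```

Quadrupling the noise variance leaves the ensemble mean unchanged within sampling error, in contrast to the multiplicative-noise case above, where the decay rate itself shifts with <italic>D</italic>.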
<p>For completeness, it is important to mention quasi-cycle activity [<xref ref-type="bibr" rid="B52">52</xref>&#x02013;<xref ref-type="bibr" rid="B55">55</xref>]. Mathematically, this is the linear response to additive noise of a deterministically stable system below a Hopf bifurcation. Without noise, the system would decay exponentially to its stable fixed point, a stable focus, whereas the additive noise <italic>kicks away</italic> the system from the fixed point, so that the fixed point is never reached. For long times, the stationary power spectrum of the system's activity is proportional to the noise variance <italic>D</italic>. This linear relation between spectral power and noise variance has been employed extensively in a large number of previous model studies of experimental spectral power distributions, e.g., in the brain [<xref ref-type="bibr" rid="B56">56</xref>, <xref ref-type="bibr" rid="B57">57</xref>]. It is important to point out that the additive noise merely scales the global magnitude of the linear system's spectral power distribution; it does not affect selected time scales or frequency bands, e.g., it cannot move spectral peaks as observed in the brain [<xref ref-type="bibr" rid="B58">58</xref>]. The subsequent sections illustrate how additive noise may affect nonlinear systems in such a way that the system's intrinsic time scales depend on the noise variance.</p>
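The quasi-cycle picture can be illustrated with a minimal simulation (not from the article; all parameter values are assumed): a damped linear oscillator below a Hopf bifurcation, driven by additive white noise, exhibits a spectral power that scales linearly with the noise variance <italic>D</italic>, while the spectral peak frequency does not move.

```python
import numpy as np

# Illustrative quasi-cycle sketch (assumed parameters, not from the article):
#   dx = v dt
#   dv = (-2*mu*v - w0**2 * x) dt + sqrt(D) dW ,   mu > 0 (stable focus).
# Doubling D should double the spectral power globally, while the spectral
# peak stays near the deterministic resonance frequency ~ w0/(2*pi).

rng = np.random.default_rng(1)

def simulate(D, mu=0.1, w0=1.0, dt=0.02, n_steps=250_000, seg=8192):
    noise = np.sqrt(D * dt) * rng.standard_normal(n_steps)
    x, v = 0.0, 0.0
    xs = np.empty(n_steps)
    for n in range(n_steps):
        # explicit Euler step; old x is used in the v-update (tuple RHS)
        x, v = x + v * dt, v + (-2 * mu * v - w0**2 * x) * dt + noise[n]
        xs[n] = x
    # averaged periodogram over non-overlapping segments
    n_seg = n_steps // seg
    segs = xs[: n_seg * seg].reshape(n_seg, seg)
    psd = (np.abs(np.fft.rfft(segs, axis=1)) ** 2).mean(axis=0)
    freqs = np.fft.rfftfreq(seg, d=dt)
    return xs.var(), freqs[np.argmax(psd[1:]) + 1]  # skip the DC bin

var1, f1 = simulate(D=0.05)
var2, f2 = simulate(D=0.10)
print(var2 / var1, f1, f2)  # power ratio ~2, peak near w0/(2*pi) ~ 0.159 Hz
```

The variance ratio tracks the noise-variance ratio, while the peak frequency is identical for both runs, illustrating that additive noise in a linear system scales the spectrum without retuning its time scales.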
<p>The following two sections propose extensions of existing ANISE studies. The first generalizes previous studies of stochastic dynamics in specific nonlinear random networks, indicating a perspective for generalizing the analysis of such systems. This brief analysis is followed by the novel proposal to extend the stochastic analysis in ANISE by numerical estimates of subgrid-scale models.</p></sec>
<sec id="s2">
<title>Dynamic Random Network Models</title>
<p>To illustrate a possible perspective in ANISE, let us consider a random network of <italic>N</italic> nodes, whose node activity obeys</p>
<disp-formula id="E4"><mml:math id="M4"><mml:mtable columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:msub><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mi>g</mml:mi><mml:mrow><mml:mo stretchy="false">[</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo stretchy="false">]</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>j</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msub><mml:mrow><mml:mi>K</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mo stretchy="false">[</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo stretchy="false">]</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003BE;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mtext>&#x000A0;</mml:mtext><mml:mo>,</mml:mo></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>for <italic>i</italic> &#x0003D; 1, &#x02026;, <italic>N</italic>. The additive noise {&#x003BE;<sub><italic>i</italic></sub>(<italic>t</italic>)} is uncorrelated between network nodes and Gaussian distributed with zero mean <inline-formula><mml:math id="M5"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover><mml:msub><mml:mrow><mml:mi>&#x003BE;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>/</mml:mo><mml:mi>N</mml:mi><mml:mo>=</mml:mo><mml:mn>0</mml:mn></mml:math></inline-formula> and variance <inline-formula><mml:math id="M6"><mml:mi>D</mml:mi><mml:mo>=</mml:mo><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover><mml:msubsup><mml:mrow><mml:mi>&#x003BE;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>/</mml:mo><mml:mi>N</mml:mi></mml:math></inline-formula> at every time instance <italic>t</italic>. 
The connectivity matrix <bold>K</bold> is random with non-vanishing mean <inline-formula><mml:math id="M7"><mml:munder class="msub"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:munder><mml:msub><mml:mrow><mml:mi>K</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>/</mml:mo><mml:mi>N</mml:mi><mml:mo>&#x02260;</mml:mo><mml:mn>0</mml:mn></mml:math></inline-formula>. To gain some insights, at first <inline-formula><mml:math id="M8"><mml:mi>g</mml:mi><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mo>]</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mo>-</mml:mo><mml:mi>V</mml:mi><mml:mo>,</mml:mo><mml:mi>S</mml:mi><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mo>]</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mi>V</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:msup><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:msup><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula> and the network is of Erd&#x000F6;s-R&#x000E9;nyi-type (ER) [<xref ref-type="bibr" rid="B59">59</xref>], i.e., <italic>K</italic><sub><italic>ij</italic></sub> &#x0003D; 0 with probability 1&#x02212;<italic>c</italic> and <inline-formula><mml:math id="M9"><mml:msub><mml:mrow><mml:mi>K</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mover 
accent="true"><mml:mrow><mml:mi>K</mml:mi></mml:mrow><mml:mo>&#x00304;</mml:mo></mml:mover><mml:mo>&#x02260;</mml:mo><mml:mn>0</mml:mn></mml:math></inline-formula> with connection probability <italic>c</italic>. In addition, the network is bidirectional with <italic>K</italic><sub><italic>ij</italic></sub> &#x0003D; <italic>K</italic><sub><italic>ji</italic></sub> and we choose <inline-formula><mml:math id="M10"><mml:mover accent="true"><mml:mrow><mml:mi>K</mml:mi></mml:mrow><mml:mo>&#x00304;</mml:mo></mml:mover><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>/</mml:mo><mml:mi>N</mml:mi><mml:mi>c</mml:mi></mml:math></inline-formula> for convenience. This choice yields a real-valued matrix spectrum and its maximum eigenvalue is &#x003BB;<sub>1</sub> &#x0003D; 1. As has been shown previously [<xref ref-type="bibr" rid="B60">60</xref>], a Galerkin ansatz assists to understand the dynamics of such a network. We define a bi-orthogonal eigenbasis of <bold>K</bold> with the basis sets {<bold>&#x003A8;</bold><sup>(<italic>k</italic>)</sup>}, {<bold>&#x003A6;</bold><sup>(<italic>k</italic>)</sup>} for which</p>
<disp-formula id="E5"><mml:math id="M11"><mml:mtable columnalign="left"><mml:mtr><mml:mtd><mml:msup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A8;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mi>&#x02020;</mml:mi></mml:mrow></mml:msup><mml:msup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A6;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>l</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B4;</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mi>l</mml:mi></mml:mrow></mml:msub><mml:mtext>&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mo>,</mml:mo><mml:mtext>&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>l</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mo>&#x02026;</mml:mo><mml:mo>,</mml:mo><mml:mi>N</mml:mi></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>with the Kronecker symbol &#x003B4;<sub><italic>kl</italic></sub>, where &#x02020; denotes the complex conjugate transposition. The eigenspectrum {&#x003BB;<sub><italic>k</italic></sub>} obeys</p>
<disp-formula id="E6"><mml:math id="M12"><mml:mtable columnalign="left"><mml:mtr><mml:mtd><mml:mstyle mathvariant="bold"><mml:mtext>K</mml:mtext></mml:mstyle><mml:msup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A6;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>l</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mo>&#x003BB;</mml:mo></mml:mrow><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub><mml:msup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A6;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>l</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msup><mml:mtext>&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mo>,</mml:mo><mml:mtext>&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:msup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A8;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>l</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mi>&#x02020;</mml:mi></mml:mrow></mml:msup><mml:mstyle mathvariant="bold"><mml:mtext>K</mml:mtext></mml:mstyle><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mo>&#x003BB;</mml:mo></mml:mrow><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub><mml:msup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A8;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>l</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mi>&#x02020;</mml:mi></mml:mrow></mml:msup><mml:mtext>&#x000A0;</mml:mtext><mml:mo>.</mml:mo></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>It is well-known that the eigenspectrum of symmetric random matrices has an edge distribution and a bulk distribution of eigenvalues [<xref ref-type="bibr" rid="B61">61</xref>]. For large ER networks with <italic>N</italic> &#x02192; &#x0221E;, both distributions are well-separated. The edge distribution consists of the maximum eigenvalue &#x003BB;<sub>1</sub> with <inline-formula><mml:math id="M13"><mml:msubsup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mtext>&#x003A8;</mml:mtext></mml:mstyle></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>/</mml:mo><mml:mi>N</mml:mi><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:msubsup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A6;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:math></inline-formula> and the bulk distribution obeys Wigner's semicircle law and thus shrinks with &#x003BB;<sub><italic>k</italic>&#x0003E;1</sub> &#x02192; 0 for <italic>N</italic> &#x02192; &#x0221E;. 
Assuming the composition <inline-formula><mml:math id="M14"><mml:msub><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover><mml:msubsup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A6;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:math></inline-formula> with time-dependent mode amplitudes <italic>x</italic><sub><italic>k</italic></sub>(<italic>t</italic>) and projecting the network activity {<italic>V</italic><sub><italic>i</italic></sub>} onto the basis {<bold>&#x003A8;</bold><sup>(<italic>k</italic>)</sup>}, the mode amplitudes {<italic>x</italic><sub><italic>k</italic></sub>(<italic>t</italic>)} obey</p>
<disp-formula id="E7"><label>(1)</label><mml:math id="M15"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>j</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:msub><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msubsup></mml:mrow><mml:mo 
stretchy="false">)</mml:mo></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msub><mml:mrow><mml:mi>&#x003BE;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="E9"><label>(2)</label><mml:math id="M17"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>&#x02248;</mml:mo><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msubsup><mml:mrow><mml:mtext>&#x003A8;</mml:mtext></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup><mml:msub><mml:mrow><mml:mi>&#x003BE;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mtext>&#x000A0;</mml:mtext><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mo>&#x02026;</mml:mo><mml:mo>,</mml:mo><mml:mi>N</mml:mi></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>Then <italic>V</italic><sub><italic>i</italic></sub>(<italic>t</italic>) &#x0003D; <italic>x</italic><sub>1</sub>(<italic>t</italic>) &#x0002B; &#x003B7;<sub><italic>i</italic></sub>(<italic>t</italic>) with the Ornstein-Uhlenbeck noise process <inline-formula><mml:math id="M18"><mml:msub><mml:mrow><mml:mi>&#x003B7;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>2</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover><mml:msubsup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A6;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> that is Gaussian distributed with <inline-formula><mml:math id="M19"><mml:mrow><mml:mi mathvariant="-tex-caligraphic">N</mml:mi></mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mn>0</mml:mn><mml:mo>,</mml:mo><mml:mi>D</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:math></inline-formula> [<xref ref-type="bibr" rid="B60">60</xref>]. Consequently, <italic>x</italic><sub>1</sub> describes the mean-field dynamics of the network. Hence, at each node <italic>i</italic>, <italic>V</italic><sub><italic>i</italic></sub> exhibits a superposition of mode <italic>x</italic><sub>1</sub> and zero-mean fluctuations &#x003B7;<sub><italic>i</italic></sub>. For <italic>N</italic> &#x02192; &#x0221E; the mode amplitude <italic>x</italic><sub>1</sub> obeys</p>
<disp-formula id="E10"><label>(3)</label><mml:math id="M20"><mml:mtable columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:msub><mml:mi>x</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mo>&#x02212;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>+</mml:mo><mml:mstyle displaystyle="true"><mml:mrow><mml:msubsup><mml:mo>&#x0222B;</mml:mo><mml:mrow><mml:mo>&#x02212;</mml:mo><mml:mi>&#x0221E;</mml:mi></mml:mrow><mml:mi>&#x0221E;</mml:mi></mml:msubsup><mml:mrow><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:msub><mml:mi>&#x003B3;</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>x</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>+</mml:mo><mml:mi>w</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>&#x003B3;</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:msup><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>x</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>+</mml:mo><mml:mi>w</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:mrow></mml:mrow></mml:mrow></mml:mstyle></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mrow><mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>&#x003B3;</mml:mi><mml:mn>3</mml:mn></mml:msub><mml:msup><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>x</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>+</mml:mo><mml:mi>w</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mn>3</mml:mn></mml:msup></mml:mrow><mml:mo>]</mml:mo></mml:mrow><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>o</mml:mi><mml:mi>u</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>w</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mi>d</mml:mi><mml:mi>w</mml:mi></mml:
mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="E12"><label>(4)</label><mml:math id="M22"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mi>D</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:mn>3</mml:mn><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:mi>D</mml:mi><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msubsup></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>with the Gaussian probability distribution of the Ornstein-Uhlenbeck process <inline-formula><mml:math id="M24"><mml:msub><mml:mrow><mml:mi>p</mml:mi></mml:mrow><mml:mrow><mml:mi>o</mml:mi><mml:mi>u</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mrow><mml:mi mathvariant="-tex-caligraphic">N</mml:mi></mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mn>0</mml:mn><mml:mo>,</mml:mo><mml:mi>D</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:math></inline-formula>. The mode <italic>x</italic><sub>1</sub> is deterministic, and the additive noise &#x003BE;<sub><italic>i</italic></sub>, which drives the network at each node, affects the mean-field network activity by tuning the stability of the stationary state <italic>x</italic><sub>1</sub> &#x0003D; 0 of the network. Moreover, the system's time scale, which is determined by the linear coefficient in Equation (4), now depends on the noise variance <italic>D</italic>.</p>
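<p>As a minimal numerical sketch (not part of the original derivation), the stability change of <italic>x</italic><sub>1</sub> &#x0003D; 0 with the noise variance <italic>D</italic> in Equation (4) can be checked by direct integration; the parameter values are the assumed ones from Figure 1 (&#x003B3;<sub>1</sub> &#x0003D; 2, &#x003B3;<sub>2</sub> &#x0003D; 0, &#x003B3;<sub>3</sub> &#x0003D; &#x02212;1), so the linear coefficient reads 1 &#x02212; 3<italic>D</italic>:</p>

```python
import numpy as np

# parameters as in Figure 1 (gamma2 = 0, so the constant and quadratic terms vanish)
g1, g2, g3 = 2.0, 0.0, -1.0

def drift(x, D):
    """Right-hand side of the deterministic mean-field equation (4)."""
    return g2 * D + (g1 + 3 * g3 * D - 1) * x + g2 * x**2 + g3 * x**3

def integrate(D, x0=1.1, dt=0.03, steps=2000):
    """Explicit Euler integration of the mean-field mode x1."""
    x = x0
    for _ in range(steps):
        x += dt * drift(x, D)
    return x

# small D: x1 = 0 is unstable and the mode settles at a nonzero state;
# large D: the additive-noise variance stabilizes x1 = 0
for D in (0.1, 0.5):
    print(D, integrate(D))
```

<p>With these values the nonzero state is <inline-formula>&#x0221A;(1 &#x02212; 3<italic>D</italic>)</inline-formula> for <italic>D</italic> &#x0003C; 1/3, and only the origin remains for larger variances.</p>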
<p><xref ref-type="fig" rid="F1">Figure 1A</xref> illustrates the assumptions made. For small networks, the eigenvalue spectrum of the random matrix exhibits a clear gap between the edge spectrum (&#x003BB;<sub>1</sub> &#x0003D; 1) and the bulk spectrum with &#x003BB;<sub><italic>k</italic>&#x0003E;1</sub> &#x02248; 0. This spectral gap increases with increasing <italic>N</italic>. In addition, for small networks the conversion of the sum in (1) to the integral in (3) is a poor approximation: the sum in (1) exhibits strong stochastic fluctuations and hence the network mean fluctuates as well. The larger <italic>N</italic>, the better the integral in (3) approximates the sum and the more the dynamics resemble the deterministic mean-field dynamics. <xref ref-type="fig" rid="F1">Figure 1B</xref> presents the system's corresponding potential</p>
<disp-formula id="E14"><mml:math id="M25"><mml:mtable columnalign="left"><mml:mtr><mml:mtd><mml:mi>&#x003D5;</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mi>D</mml:mi><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:mn>3</mml:mn><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:mi>D</mml:mi><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo 
stretchy="false">)</mml:mo></mml:mrow><mml:msubsup><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:mfrac><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msubsup><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>4</mml:mn></mml:mrow></mml:mfrac><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>4</mml:mn></mml:mrow></mml:msubsup></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>with <italic>dx</italic><sub>1</sub>/<italic>dt</italic> &#x0003D; &#x02212;<italic>d&#x003D5;</italic>(<italic>x</italic><sub>1</sub>)/<italic>dx</italic><sub>1</sub>. For increasing noise variance <italic>D</italic>, the additive noise merges the stable fixed point (local minimum of &#x003D5; at <italic>V</italic> &#x02260; 0) and the unstable fixed point (local maximum of &#x003D5; at <italic>V</italic> &#x0003D; 0) yielding finally a single stable fixed point (global minimum of &#x003D5;).</p>
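<p>The merging of fixed points under increasing <italic>D</italic> can be sketched directly from the roots of the drift in Equation (4). With the Figure 1 parameters (&#x003B3;<sub>2</sub> &#x0003D; 0) the roots are available in closed form; the helper below is an illustrative construction, not taken from the original analysis:</p>

```python
import numpy as np

# Figure 1 parameters with gamma2 = 0: dx1/dt = (g1 + 3*g3*D - 1)*x1 + g3*x1**3
g1, g3 = 2.0, -1.0

def fixed_points(D):
    """Real roots of the cubic drift; here the linear coefficient is 1 - 3*D."""
    a = g1 + 3 * g3 * D - 1
    if a / (-g3) > 0:            # the nonzero pair exists only below the threshold
        r = np.sqrt(a / (-g3))
        return sorted([-r, 0.0, r])
    return [0.0]

# the pair of nonzero fixed points approaches the origin and merges with it
# at D = 1/3, leaving a single stable state
for D in (0.1, 0.2, 0.4):
    print(D, fixed_points(D))
```
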
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>Mean-field description of ANISE. <bold>(A)</bold> The spectral distribution of the random matrix K for different numbers of network nodes <italic>N</italic> (left column) and corresponding network mean (solid line, from Equation 1) and mean-field (dashed line, from Equation 4) for comparison (right column). Parameters are <italic>K</italic><sub>0</sub> &#x0003D; 1.0, <italic>D</italic> &#x0003D; 0.1, &#x003B3;<sub>1</sub> &#x0003D; 2.0, &#x003B3;<sub>2</sub> &#x0003D; 0.0, &#x003B3;<sub>3</sub> &#x0003D; &#x02212;1.0, &#x003B1; &#x0003D; &#x003B2; &#x0003D; 0.0, <italic>c</italic> &#x0003D; 0.9 and numerical integration time step &#x00394;<italic>t</italic> &#x0003D; 0.03 utilizing the Euler-Maruyama integration method and identical initial values <italic>V</italic><sub><italic>i</italic></sub> &#x0003D; 1.1 at <italic>t</italic> &#x0003D; 0 for all parameter sets. The panels show results for a single network realization, while the variance of results for multiple network realizations is found to be negligible for <italic>N</italic> &#x02265; 1,000 (data not shown). <bold>(B)</bold> Potential &#x003D5;(<italic>V</italic>) for different noise variances <italic>D</italic>.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fams-08-879866-g0001.tif"/>
</fig>
<p>Although this example illustrates how additive noise induces a stability transition, the underlying network model is too simple to correspond to natural networks. It assumes a large ER-type network, which implies a clear separation between edge and bulk spectrum and in turn reflects a sharp unimodal degree distribution. If this spectral gap is not present, the mean-field activity <italic>x</italic><sub>1</sub> feeds back onto the noise processes <italic>x</italic><sub><italic>k</italic>&#x0003E;1</sub>, which renders the analysis much more complex (closure problem). Such a case occurs in most natural networks [<xref ref-type="bibr" rid="B62">62</xref>], such as scale-free [<xref ref-type="bibr" rid="B63">63</xref>] or small-world networks [<xref ref-type="bibr" rid="B64">64</xref>]. Moreover, realistic networks exhibit nonlinear local dynamics <italic>g</italic>(&#x000B7;), which renders a Galerkin approach as shown above much more complex, since it leads to a closure problem as well. As a perspective, new methods have to be developed to treat such cases and to reveal whether ANISE represents the underlying mechanism. In this context, subgrid-scale modeling, as outlined in the subsequent section, may be a promising approach.</p></sec>
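<p>The assumed spectral gap of an ER-type coupling matrix can be illustrated numerically. The sketch below uses a hypothetical toy construction (entries <italic>K</italic><sub>0</sub>/(<italic>cN</italic>) present with probability <italic>c</italic>, so that the Perron edge eigenvalue is close to <italic>K</italic><sub>0</sub> &#x0003D; 1 while the bulk shrinks like 1/&#x0221A;<italic>N</italic>); the normalization is an assumption for illustration, not the paper's exact matrix:</p>

```python
import numpy as np

rng = np.random.default_rng(1)

def edge_and_bulk(N, c=0.9, K0=1.0):
    """Edge (Perron) eigenvalue and largest bulk modulus of an ER-type matrix.

    Entries equal K0/(c*N) with probability c, so row sums are ~ K0 and the
    bulk radius scales like 1/sqrt(N): the spectral gap widens with N.
    """
    K = K0 / (c * N) * (rng.random((N, N)) < c)
    ev = np.linalg.eigvals(K)
    i = np.argmax(ev.real)
    edge = ev.real[i]
    bulk = np.max(np.delete(np.abs(ev), i))
    return edge, bulk

for N in (50, 200, 1000):
    print(N, edge_and_bulk(N))
```
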
<sec id="s3">
<title>Subgrid-Scale Modeling (SGS)</title>
<p>Most natural systems evolve on multiple spatial and/or temporal scales. Examples are biological systems [<xref ref-type="bibr" rid="B65">65</xref>], such as the brain or body tissue, and the Earth's atmosphere. The latter may exhibit turbulent dynamics whose dynamical details are typically described by the Navier-Stokes equation [<xref ref-type="bibr" rid="B66">66</xref>]. The closure problem states that large scales determine the dynamics of small scales and vice versa. In general, there is rarely a detailed model description of the dynamics on all system scales, and the corresponding numerical simulation of all scales is costly. To this end, subgrid-scale modeling [<xref ref-type="bibr" rid="B67">67</xref>] chooses a certain model description level and provides a model that captures the effective contribution of smaller scales to the dynamics on the chosen level. The present work proposes, as a perspective, to apply SGS to random network models (cf. the previous section) and to estimate the subgrid-scale dynamics numerically from full-scale simulations.</p>
<p>For illustration, let us reconsider the example of the previous section. It shows how noise &#x003BE;<sub><italic>i</italic></sub> on the microscopic scale, i.e., at each node, impacts the evolution on the mesoscopic scale, i.e., the spatial mean or mean field. The Galerkin ansatz succeeds in the given case for a large family of nonlinear coupling functions <italic>S</italic>[&#x000B7;] and linear local functions. However, complex network systems typically exhibit nonlinear local dynamics. For illustration, in the following we choose <italic>g</italic>[<italic>V</italic>] &#x0003D; &#x02212;<italic>V</italic> &#x0002B; &#x003B2;<italic>V</italic><sup>3</sup>, and projections onto the eigenbasis of the random matrix yield</p>
<disp-formula id="E15"><label>(5)</label><mml:math id="M26"><mml:mtable columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:msub><mml:mi>x</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mo>&#x02212;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>+</mml:mo><mml:mfrac><mml:mi>&#x003B2;</mml:mi><mml:mi>N</mml:mi></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover><mml:mo>&#x02211;</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>N</mml:mi></mml:munderover><mml:mrow><mml:msubsup><mml:mi>V</mml:mi><mml:mi>i</mml:mi><mml:mn>3</mml:mn></mml:msubsup></mml:mrow></mml:mstyle><mml:mo>+</mml:mo><mml:mfrac><mml:mn>1</mml:mn><mml:mi>N</mml:mi></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover><mml:mo>&#x02211;</mml:mo><mml:mrow><mml:mi>j</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>N</mml:mi></mml:munderover><mml:mo stretchy="false">(</mml:mo></mml:mstyle><mml:msub><mml:mi>&#x003B3;</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:msub><mml:mi>V</mml:mi><mml:mi>j</mml:mi></mml:msub></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mo>+</mml:mo><mml:msub><mml:mi>&#x003B3;</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:msubsup><mml:mi>V</mml:mi><mml:mi>j</mml:mi><mml:mn>2</mml:mn></mml:msubsup><mml:mo>+</mml:mo><mml:msub><mml:mi>&#x003B3;</mml:mi><mml:mn>3</mml:mn></mml:msub><mml:msubsup><mml:mi>V</mml:mi><mml:mi>j</mml:mi><mml:mn>3</mml:mn></mml:msubsup><mml:mo stretchy="false">)</mml:mo><mml:mo>+</mml:mo><mml:mfrac><mml:mn>1</mml:mn><mml:mi>N</mml:mi></mml:mfrac><mml:mstyle 
displaystyle="true"><mml:munderover><mml:mo>&#x02211;</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>N</mml:mi></mml:munderover><mml:mrow><mml:msub><mml:mi>&#x003BE;</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:mrow></mml:mstyle></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="E17"><label>(6)</label><mml:math id="M28"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:mi>&#x003B2;</mml:mi><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msubsup><mml:mrow><mml:mtext>&#x003A8;</mml:mtext></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup><mml:msubsup><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msubsup><mml:mo>&#x0002B;</mml:mo><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msubsup><mml:mrow><mml:mtext>&#x003A8;</mml:mtext></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup><mml:msub><mml:mrow><mml:mi>&#x003BE;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>for <italic>k</italic> &#x0003D; 2, &#x02026;, <italic>N</italic>. It is still <italic>V</italic><sub><italic>i</italic></sub> &#x0003D; <italic>x</italic><sub>1</sub> &#x0002B; &#x003B7;<sub><italic>i</italic></sub> with <inline-formula><mml:math id="M29"><mml:msub><mml:mrow><mml:mi>&#x003B7;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>2</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover><mml:msubsup><mml:mrow><mml:mstyle mathvariant="bold"><mml:mo>&#x003A6;</mml:mo></mml:mstyle></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> as in the previous section, but now the noise term &#x003B7;<sub><italic>i</italic></sub> is no longer an Ornstein-Uhlenbeck process, due to the nonlinearity in Equation (6). Since the basis {<bold>&#x003A8;</bold><sup>(<italic>k</italic>)</sup>} is not known analytically, it is very difficult to obtain the stationary probability density function <italic>p</italic>(&#x000B7;) of &#x003B7;<sub><italic>i</italic></sub>.</p>
<p>To address this problem, a close look at <italic>V</italic><sub><italic>i</italic></sub> &#x0003D; <italic>x</italic><sub>1</sub> &#x0002B; &#x003B7;<sub><italic>i</italic></sub> suggests interpreting <italic>x</italic><sub>1</sub> as the mesoscopic activity of the network and &#x003B7;<sub><italic>i</italic></sub> as microscopic activity. This interpretation is motivated by the previous section and stipulates <inline-formula><mml:math id="M30"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover><mml:msub><mml:mrow><mml:mi>&#x003B7;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mn>0</mml:mn></mml:math></inline-formula>. Hence, the additional nonlinear local interaction affects the mesoscopic scale dynamics by modulating the microscopic scale dynamics. A new subgrid-scale model ansatz now assumes that the additional nonlinear local interaction affects the variance of the microscopic scale but retains the Gaussian shape of the microscopic distribution, i.e., <inline-formula><mml:math id="M31"><mml:mi>&#x003B7;</mml:mi><mml:mo>&#x0007E;</mml:mo><mml:mrow><mml:mi mathvariant="-tex-caligraphic">N</mml:mi></mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mn>0</mml:mn><mml:mo>,</mml:mo><mml:mi>b</mml:mi><mml:mi>D</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:math></inline-formula>. Here the additional factor <italic>b</italic> &#x0003E; 0 captures the impact of the additional nonlinear term <inline-formula><mml:math id="M32"><mml:mi>&#x003B2;</mml:mi><mml:msubsup><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msubsup></mml:math></inline-formula> in <italic>g</italic>[<italic>V</italic>] and represents the subgrid-scale model parameter for the impact of the nonlinear term. For clarification, <italic>b</italic> &#x0003D; 1 corresponds to &#x003B2; &#x0003D; 0. Inserting this ansatz and <italic>V</italic><sub><italic>i</italic></sub> &#x0003D; <italic>x</italic><sub>1</sub> &#x0002B; &#x003B7;<sub><italic>i</italic></sub> into Equation (5) and taking the limit <italic>N</italic> &#x02192; &#x0221E; yields</p>
<disp-formula id="E18"><label>(7)</label><mml:math id="M33"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:mi>d</mml:mi><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo>=</mml:mo><mml:mi>F</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mi>b</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mi>D</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:mn>3</mml:mn><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:mi>D</mml:mi><mml:mo>-</mml:mo><mml:mn>1</mml:mn><mml:mo>&#x0002B;</mml:mo><mml:mn>3</mml:mn><mml:mi>&#x003B2;</mml:mi><mml:mi>b</mml:mi><mml:mi>D</mml:mi></mml:mrow><mml:mo 
stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>&#x0002B;</mml:mo><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>&#x003B3;</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:mi>&#x003B2;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msubsup><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msubsup></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>with a still unknown SGS-factor <italic>b</italic>, which we now fit numerically. To this end, numerical simulations of the random network (1) permit computing the time-dependent network mean <italic>V</italic>(<italic>t</italic><sub><italic>n</italic></sub>) with discrete time <italic>t</italic><sub><italic>n</italic></sub> &#x0003D; <italic>n&#x00394;t</italic> and time step &#x00394;<italic>t</italic>, as well as its temporal derivative &#x00394;<italic>V</italic><sub><italic>n</italic></sub> &#x0003D; (<italic>V</italic>(<italic>t</italic><sub><italic>n</italic>&#x0002B;1</sub>)&#x02212;<italic>V</italic>(<italic>t</italic><sub><italic>n</italic></sub>))/&#x00394;<italic>t</italic>. Then minimizing the cost function <inline-formula><mml:math id="M35"><mml:mi>C</mml:mi><mml:mo>=</mml:mo><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>n</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>T</mml:mi></mml:mrow></mml:munderover><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x00394;</mml:mi><mml:msub><mml:mrow><mml:mi>V</mml:mi></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:mi>F</mml:mi><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mi>V</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mi>b</mml:mi></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula> with respect to <italic>b</italic> yields an optimal SGS-factor <italic>b</italic>. The corresponding parameter search is performed by Particle Swarm Optimization (PSO) [<xref ref-type="bibr" rid="B68">68</xref>, <xref ref-type="bibr" rid="B69">69</xref>]. <xref ref-type="fig" rid="F2">Figure 2A</xref> shows the simulated mesoscopic network mean and the well-fitted mesoscopic model dynamics (7) for the optimal factor <italic>b</italic> at different nonlinearity factors &#x003B2; (numbers given). For each factor &#x003B2; there is an optimal SGS-factor <italic>b</italic> (cf. <xref ref-type="fig" rid="F2">Figure 2B</xref>), and we can fit numerically a nonlinear dependence of <italic>b</italic> on &#x003B2;</p>
<disp-formula id="E20"><label>(8)</label><mml:math id="M36"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>b</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x003B2;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mn>0</mml:mn><mml:mo>.</mml:mo><mml:mn>45</mml:mn><mml:mo>&#x0002B;</mml:mo><mml:mn>0</mml:mn><mml:mo>.</mml:mo><mml:mn>55</mml:mn><mml:msup><mml:mrow><mml:mi>e</mml:mi></mml:mrow><mml:mrow><mml:mn>6</mml:mn><mml:mo>.</mml:mo><mml:mn>78</mml:mn><mml:mi>&#x003B2;</mml:mi></mml:mrow></mml:msup><mml:mtext>&#x000A0;</mml:mtext><mml:mo>.</mml:mo></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
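<p>The fitting step above can be sketched in a few lines. Since the drift <italic>F</italic> of Equation (7) is linear in <italic>b</italic>, a closed-form least-squares fit suffices for this toy version (the study itself uses PSO via <italic>pyswarms</italic>); for brevity, the "network mean" is generated here from the mean-field model itself with a known factor <italic>b</italic>, rather than from a full network simulation, and the chosen &#x003B2; and <italic>b</italic> values are purely illustrative:</p>

```python
import numpy as np

# parameters as in Figure 2 (gamma2 = 0); beta and b_true are illustrative
g1, g2, g3, D = 2.0, 0.0, -1.0, 0.33
beta, b_true, dt = -0.5, 0.6, 0.03

def F(x, b):
    """Mean-field drift of Equation (7)."""
    return (g2 * D + (g1 + 3 * g3 * D - 1 + 3 * beta * b * D) * x
            + g2 * x**2 + (g3 + beta) * x**3)

# surrogate network mean: Euler integration of the mean field with b_true
V = [1.1]
for _ in range(400):
    V.append(V[-1] + dt * F(V[-1], b_true))
V = np.asarray(V)

dV = (V[1:] - V[:-1]) / dt     # finite-difference derivative Delta V_n
x = V[:-1]

# F depends on b only through the term 3*beta*b*D*x, so minimizing the
# quadratic cost C = sum_n (dV_n - F[V(t_n), b])^2 has a closed-form solution
slope = 3 * beta * D * x
b_hat = slope @ (dV - F(x, 0.0)) / (slope @ slope)
print(b_hat)   # recovers b_true
```
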
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p>Subgrid-scale modeling. <bold>(A)</bold> Network mean (solid line) computed from simulations of Equation (1) and mean-field dynamics (7) for different values of &#x003B2; (numbers given) and corresponding optimally estimated factors <italic>b</italic>. Parameters are <italic>V</italic><sub><italic>i</italic></sub>(0) &#x0003D; <italic>x</italic><sub>1</sub>(0) &#x0003D; 1.1 &#x02200;<italic>i</italic> &#x0003D; 1, &#x02026;, <italic>N</italic>, &#x00394;<italic>t</italic> &#x0003D; 0.03, <italic>K</italic><sub>0</sub> &#x0003D; 1.0, <italic>D</italic> &#x0003D; 0.33, &#x003B3;<sub>1</sub> &#x0003D; 2.0, &#x003B3;<sub>2</sub> &#x0003D; 0.0, &#x003B3;<sub>3</sub> &#x0003D; &#x02212;1.0, &#x003B1; &#x0003D; 0.0, <italic>N</italic> &#x0003D; 1000 and <italic>c</italic> &#x0003D; 0.9. The results have been obtained for a single network realization, while the variance of results for multiple network realizations is found to be negligible for <italic>N</italic> &#x02265; 1000 (data not shown). The optimal factor <italic>b</italic> has been estimated by employing the Python library <italic>pyswarms</italic>, utilizing the routine <italic>single.GlobalBestPSO</italic> with optional parameters <italic>c</italic><sub>1</sub> &#x0003D; 0.5, <italic>c</italic><sub>2</sub> &#x0003D; 0.5, <italic>w</italic> &#x0003D; 0.3, 60 particles, 50 iterations and taking the best fit over 20 trials. <bold>(B)</bold> The estimated factor <italic>b</italic> for different values of &#x003B2; (dots) and the fitted function (dashed line; see Equation 8). <bold>(C)</bold> The resulting potential &#x003D5;(<italic>V</italic>) with <italic>d&#x003D5;</italic>/<italic>dV</italic> &#x0003D; &#x02212;<italic>F</italic>[<italic>V, b</italic>(&#x003B2;)], where <italic>F</italic> is taken from Equation (7) and <italic>b</italic> is computed from &#x003B2; (numbers given) via Equation (8). The dashed line is plotted for comparison, illustrating the impact of the SGS.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fams-08-879866-g0002.tif"/>
</fig>
<p>This expression is the major result of the SGS modeling, since it permits describing the random network dynamics with nonlinear local interactions by mesoscopic variables, with the assistance of a numerical optimization. Note that Equation (8) remains valid if other initial conditions are chosen for the network simulation (not shown).</p>
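<p>The estimation procedure named in the caption of Figure 2 is a global-best particle swarm with cognitive and social coefficients <italic>c</italic><sub>1</sub> &#x0003D; <italic>c</italic><sub>2</sub> &#x0003D; 0.5 and inertia <italic>w</italic> &#x0003D; 0.3, over 60 particles and 50 iterations. The following is a minimal self-contained sketch of that scheme, hand-rolled with NumPy rather than calling pyswarms; the loss function is a hypothetical stand-in with a known minimum, since the network simulation code and the actual mean-field mismatch functional are not reproduced here.</p>

```python
import numpy as np

def global_best_pso(loss, bounds, n_particles=60, iters=50,
                    c1=0.5, c2=0.5, w=0.3, seed=0):
    """Minimal global-best PSO in one dimension, mirroring the update
    scheme of pyswarms' single.GlobalBestPSO with the caption's
    parameters (c1 = c2 = 0.5, w = 0.3, 60 particles, 50 iterations)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, n_particles)   # particle positions
    v = np.zeros(n_particles)              # particle velocities
    pbest = x.copy()                       # per-particle best positions
    pbest_cost = loss(x)
    g = pbest[np.argmin(pbest_cost)]       # global best position
    for _ in range(iters):
        r1 = rng.random(n_particles)
        r2 = rng.random(n_particles)
        # inertia + cognitive pull toward pbest + social pull toward g
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        cost = loss(x)
        better = cost < pbest_cost
        pbest[better] = x[better]
        pbest_cost[better] = cost[better]
        g = pbest[np.argmin(pbest_cost)]
    return g, float(np.min(pbest_cost))

# Hypothetical stand-in loss: in the paper it would be the squared
# deviation between the mean-field trajectory for a trial factor b and
# the simulated network mean; here a toy quadratic with minimum b* = 2.
loss = lambda b: (b - 2.0) ** 2
b_opt, best_cost = global_best_pso(loss, bounds=(-5.0, 5.0))
```

<p>"Taking the best fit over 20 trials", as in the caption, corresponds to rerunning this routine with 20 different seeds and keeping the smallest returned cost.</p>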
<p>The impact of the nonlinear local interaction on the mean-field is illustrated in <xref ref-type="fig" rid="F2">Figure 2C</xref>, where the potential &#x003D5;(<italic>V</italic> &#x0003D; <italic>x</italic><sub>1</sub>) is shown for different factors &#x003B2; with the corresponding optimal SGS factors (different colors in <xref ref-type="fig" rid="F2">Figure 2C</xref>). For comparison, neglecting the SGS model correction (for &#x003B2; &#x0003D; &#x02212;1.0, dashed line in <xref ref-type="fig" rid="F2">Figure 2C</xref>) yields a potential and dynamical evolution different from the true SGS-optimized potential (red line in <xref ref-type="fig" rid="F2">Figure 2C</xref>).</p></sec>
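<p>Equation (8), relating the SGS factor <italic>b</italic> to the nonlinearity parameter &#x003B2; (dashed line in Figure 2B), is a polynomial fit to the optimally estimated factors. Such a fit reduces to a least-squares polynomial regression; a sketch follows, in which both the (&#x003B2;, <italic>b</italic>) pairs and the polynomial degree are assumptions for illustration and do not reproduce the paper's actual estimates or Equation (8).</p>

```python
import numpy as np

# Hypothetical (beta, b) pairs standing in for the dots of Figure 2B;
# the actual estimated values are not reproduced here.
beta = np.linspace(-1.0, 1.0, 9)
b_est = 0.4 - 0.3 * beta + 0.1 * beta**2   # assumed data for illustration

# Least-squares polynomial fit, the operation underlying Equation (8);
# degree 2 is an assumption, not the degree used in the paper.
coeffs = np.polyfit(beta, b_est, deg=2)
b_of_beta = np.poly1d(coeffs)              # callable b(beta)
```

<p>With the fitted <italic>b</italic>(&#x003B2;) in hand, the potential of Figure 2C follows by integrating &#x02212;<italic>F</italic>[<italic>V</italic>, <italic>b</italic>(&#x003B2;)] over <italic>V</italic> for each value of &#x003B2;.</p>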
<sec sec-type="discussion" id="s4">
<title>Discussion</title>
<p>A growing number of studies indicate that additive noise is an important ingredient of system dynamics. Since this effect has not yet been studied thoroughly and should be explored in the coming years, it is tempting to name it and give it the acronym ANISE. Beyond proposing to embrace and name these diverse research areas, the present work sketches two future directions of research. Section Dynamic Random Network Models shows, in a generalized random network model study, that additive noise may change the bifurcation point of the system's stationary state, in accordance with previous work on stochastic bifurcations [<xref ref-type="bibr" rid="B15">15</xref>, <xref ref-type="bibr" rid="B17">17</xref>, <xref ref-type="bibr" rid="B19">19</xref>, <xref ref-type="bibr" rid="B20">20</xref>]. The example reveals that this effect results from nonlinear, not linear, interactions. In simple words, additive noise tunes the system by multiplicative noise through the backdoor: identifying certain system modes, the system exhibits an additive noise effect if some modes are coupled nonlinearly and some of these modes are stochastic. Such modes may be Fourier modes [<xref ref-type="bibr" rid="B19">19</xref>], eigenmodes of a coupling matrix [<xref ref-type="bibr" rid="B60">60</xref>] (as seen in Section Dynamic Random Network Models), or eigenmodes of a delayed linear operator [<xref ref-type="bibr" rid="B70">70</xref>]. Since diverse nonlinear interaction types render a system's analysis complex, Section Subgrid-scale modeling (SGS) proposes a prospective combination of SGS modeling and optimal parameter estimation [<xref ref-type="bibr" rid="B71">71</xref>] in the context of ANISE. This combination permits the numerical estimation of unknown contributions of subgrid-scale dynamics to larger scales, which may be very useful in studies of high-dimensional nonlinear models whose dynamics sometimes appear analytically intractable.</p></sec>
<sec sec-type="data-availability" id="s5">
<title>Data Availability Statement</title>
<p>The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.</p></sec>
<sec id="s6">
<title>Author Contributions</title>
<p>The author has conceived the work, has performed all simulations and calculations, and has written the manuscript.</p></sec>
<sec sec-type="funding-information" id="s7">
<title>Funding</title>
<p>This work has been supported by the INRIA Action Exploratoire A/D Drugs.</p></sec>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of Interest</title>
<p>The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p></sec>
<sec sec-type="disclaimer" id="s8">
<title>Publisher&#x00027;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p></sec>
</body>
<back>
<ack><p>The author thanks the reviewer for valuable comments.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1">
<label>1.</label>
<citation citation-type="book"><person-group person-group-type="editor"><name><surname>Lasota</surname> <given-names>A</given-names></name> <name><surname>Mackey</surname> <given-names>MC</given-names></name></person-group> editors. <source>Chaos, Fractals, and Noise - Stochastic Aspects of Dynamics. Vol. 94 of Applied Mathematical Sciences</source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Springer</publisher-name> (<year>1994</year>). <pub-id pub-id-type="doi">10.1007/978-1-4612-4286-4</pub-id></citation></ref>
<ref id="B2">
<label>2.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bhar</surname> <given-names>B</given-names></name> <name><surname>Khanna</surname> <given-names>A</given-names></name> <name><surname>Parihar</surname> <given-names>A</given-names></name> <name><surname>Datta</surname> <given-names>S</given-names></name> <name><surname>Raychowdhury</surname> <given-names>A</given-names></name></person-group>. <article-title>Stochastic resonance in insulator-metal-transition systems</article-title>. <source>Sci Rep.</source> (<year>2020</year>) <volume>10</volume>:<fpage>5549</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-020-62537-3</pub-id><pub-id pub-id-type="pmid">32218495</pub-id></citation></ref>
<ref id="B3">
<label>3.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kurita</surname> <given-names>Y</given-names></name> <name><surname>Shinohara</surname> <given-names>M</given-names></name> <name><surname>Ueda</surname> <given-names>J</given-names></name></person-group>. <article-title>Wearable sensorimotor enhancer for fingertip based on stochastic resonance effect</article-title>. <source>IEEE Trans Hum Mach Syst</source>. (<year>2013</year>) <volume>43</volume>:<fpage>333</fpage>&#x02013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1109/TSMC.2013.2242886</pub-id></citation></ref>
<ref id="B4">
<label>4.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gammaitoni</surname> <given-names>L</given-names></name> <name><surname>Hanggi</surname> <given-names>P</given-names></name> <name><surname>Jung</surname> <given-names>P</given-names></name></person-group>. <article-title>Stochastic resonance</article-title>. <source>Rev Mod Phys</source>. (<year>1998</year>) <volume>70</volume>:<fpage>223</fpage>&#x02013;<lpage>87</lpage>. <pub-id pub-id-type="doi">10.1103/RevModPhys.70.223</pub-id></citation></ref>
<ref id="B5">
<label>5.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wiesenfeld</surname> <given-names>K</given-names></name> <name><surname>Moss</surname> <given-names>F</given-names></name></person-group>. <article-title>Stochastic resonance and the benefits of noise: from ice ages to crayfish and SQUIDs</article-title>. <source>Nature</source>. (<year>1995</year>) <volume>373</volume>:<fpage>33</fpage>&#x02013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1038/373033a0</pub-id><pub-id pub-id-type="pmid">7800036</pub-id></citation></ref>
<ref id="B6">
<label>6.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nobukawa</surname> <given-names>S</given-names></name> <name><surname>Nishimura</surname> <given-names>H</given-names></name> <name><surname>Wagatsuma</surname> <given-names>N</given-names></name> <name><surname>Inagaki</surname> <given-names>K</given-names></name> <name><surname>Yamanishi</surname> <given-names>T</given-names></name> <name><surname>Takahashi</surname> <given-names>T</given-names></name></person-group>. <article-title>Recent trends of controlling chaotic resonance and future perspectives</article-title>. <source>Front Appl Math Stat</source>. (<year>2021</year>) <volume>7</volume>:<fpage>760568</fpage>. <pub-id pub-id-type="doi">10.3389/fams.2021.760568</pub-id></citation></ref>
<ref id="B7">
<label>7.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pikovsky</surname> <given-names>AS</given-names></name> <name><surname>Kurths</surname> <given-names>J</given-names></name></person-group>. <article-title>Coherence resonance in a noise-driven excitable system</article-title>. <source>Phys Rev Lett</source>. (<year>1997</year>) <volume>78</volume>:<fpage>775</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1103/PhysRevLett.78.775</pub-id></citation></ref>
<ref id="B8">
<label>8.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gang</surname> <given-names>H</given-names></name> <name><surname>Ditzinger</surname> <given-names>T</given-names></name> <name><surname>Ning</surname> <given-names>CZ</given-names></name> <name><surname>Haken</surname> <given-names>H</given-names></name></person-group>. <article-title>Stochastic resonance without external periodic force</article-title>. <source>Phys Rev Lett</source>. (<year>1993</year>) <volume>71</volume>:<fpage>807</fpage>&#x02013;<lpage>10</lpage>. <pub-id pub-id-type="doi">10.1103/PhysRevLett.71.807</pub-id><pub-id pub-id-type="pmid">10055373</pub-id></citation></ref>
<ref id="B9">
<label>9.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Beato</surname> <given-names>V</given-names></name> <name><surname>Sendina-Nadal</surname> <given-names>I</given-names></name> <name><surname>Gerdes</surname> <given-names>I</given-names></name> <name><surname>Engel</surname> <given-names>H</given-names></name></person-group>. <article-title>Coherence resonance in a chemical excitable system driven by coloured noise</article-title>. <source>Philos Trans A Math Phys Eng Sci</source>. (<year>2008</year>) <volume>366</volume>:<fpage>381</fpage>&#x02013;<lpage>95</lpage>. <pub-id pub-id-type="doi">10.1098/rsta.2007.2096</pub-id><pub-id pub-id-type="pmid">17673411</pub-id></citation></ref>
<ref id="B10">
<label>10.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gu</surname> <given-names>H</given-names></name> <name><surname>Yang</surname> <given-names>M</given-names></name> <name><surname>Li</surname> <given-names>L</given-names></name> <name><surname>Liu</surname> <given-names>Z</given-names></name> <name><surname>Ren</surname> <given-names>W</given-names></name></person-group>. <article-title>Experimental observation of the stochastic bursting caused by coherence resonance in a neural pacemaker</article-title>. <source>Neuroreport</source>. (<year>2002</year>) <volume>13</volume>:<fpage>1657</fpage>&#x02013;<lpage>60</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-200209160-00018</pub-id><pub-id pub-id-type="pmid">12352622</pub-id></citation></ref>
<ref id="B11">
<label>11.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lee</surname> <given-names>CY</given-names></name> <name><surname>Choi</surname> <given-names>W</given-names></name> <name><surname>Han</surname> <given-names>JH</given-names></name> <name><surname>Strano</surname> <given-names>MS</given-names></name></person-group>. <article-title>Coherence resonance in a single-walled carbon nanotube ion channel</article-title>. <source>Science</source>. (<year>2010</year>) <volume>329</volume>:<fpage>1320</fpage>&#x02013;<lpage>4</lpage>. <pub-id pub-id-type="doi">10.1126/science.1193383</pub-id><pub-id pub-id-type="pmid">20829480</pub-id></citation></ref>
<ref id="B12">
<label>12.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mompo</surname> <given-names>E</given-names></name> <name><surname>Ruiz-Garcia</surname> <given-names>M</given-names></name> <name><surname>Carretero</surname> <given-names>M</given-names></name> <name><surname>Grahn</surname> <given-names>HT</given-names></name> <name><surname>Zhang</surname> <given-names>Y</given-names></name> <name><surname>Bonilla</surname> <given-names>LL</given-names></name></person-group>. <article-title>Coherence resonance and stochastic resonance in an excitable semiconductor superlattice</article-title>. <source>Phys Rev Lett</source>. (<year>2018</year>) <volume>121</volume>:<fpage>086805</fpage>. <pub-id pub-id-type="doi">10.1103/PhysRevLett.121.086805</pub-id><pub-id pub-id-type="pmid">30192625</pub-id></citation></ref>
<ref id="B13">
<label>13.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>T&#x000F6;njes</surname> <given-names>R</given-names></name> <name><surname>Fiore</surname> <given-names>CE</given-names></name> <name><surname>Pereira</surname> <given-names>T</given-names></name></person-group>. <article-title>Coherence resonance in influencer networks</article-title>. <source>Nat Commun</source>. (<year>2021</year>) <volume>12</volume>:<fpage>72</fpage>. <pub-id pub-id-type="doi">10.1038/s41467-020-20441-4</pub-id><pub-id pub-id-type="pmid">33398017</pub-id></citation></ref>
<ref id="B14">
<label>14.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>He</surname> <given-names>GYZF</given-names></name> <name><surname>Li</surname> <given-names>JC</given-names></name> <name><surname>Mei</surname> <given-names>DC</given-names></name> <name><surname>Tang</surname> <given-names>NS</given-names></name></person-group>. <article-title>Coherence resonance-like and efficiency of financial market</article-title>. <source>Phys A</source>. (<year>2019</year>) <volume>534</volume>:<fpage>122327</fpage>. <pub-id pub-id-type="doi">10.1016/j.physa.2019.122327</pub-id></citation></ref>
<ref id="B15">
<label>15.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sri Namachchivaya</surname> <given-names>N</given-names></name></person-group>. <article-title>Stochastic bifurcation</article-title>. <source>Appl Math Comput</source>. (<year>1990</year>) <volume>38</volume>:<fpage>101</fpage>&#x02013;<lpage>59</lpage>. <pub-id pub-id-type="doi">10.1016/0096-3003(90)90051-4</pub-id></citation></ref>
<ref id="B16">
<label>16.</label>
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Arnold</surname> <given-names>L</given-names></name> <name><surname>Boxler</surname> <given-names>P</given-names></name></person-group>. <article-title>Stochastic bifurcation: instructive examples in dimension one</article-title>. In: <person-group person-group-type="editor"><name><surname>Pinsky</surname> <given-names>MA</given-names></name> <name><surname>Wihstutz</surname> <given-names>V</given-names></name></person-group>, editors. <source>Diffusion Processes and Related Problems in Analysis, Volume II: Stochastic Flows</source>. <publisher-loc>Boston, MA</publisher-loc>: <publisher-name>Birkh&#x000E4;user Boston</publisher-name> (<year>1992</year>). p. <fpage>241</fpage>&#x02013;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1007/978-1-4612-0389-6_10</pub-id></citation></ref>
<ref id="B17">
<label>17.</label>
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Arnold</surname> <given-names>L</given-names></name></person-group>. <source>Random Dynamical Systems</source>. <publisher-loc>Berlin</publisher-loc>: <publisher-name>Springer-Verlag</publisher-name> (<year>1998</year>). <pub-id pub-id-type="doi">10.1007/978-3-662-12878-7</pub-id></citation></ref>
<ref id="B18">
<label>18.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xu</surname> <given-names>C</given-names></name> <name><surname>Roberts</surname> <given-names>AJ</given-names></name></person-group>. <article-title>On the low-dimensional modelling of Stratonovich stochastic differential equations</article-title>. <source>Phys A</source>. (<year>1996</year>) <volume>225</volume>:<fpage>62</fpage>&#x02013;<lpage>80</lpage>. <pub-id pub-id-type="doi">10.1016/0378-4371(95)00387-8</pub-id></citation></ref>
<ref id="B19">
<label>19.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Longtin</surname> <given-names>A</given-names></name> <name><surname>Schimansky-Geier</surname> <given-names>L</given-names></name></person-group>. <article-title>Additive global noise delays Turing bifurcations</article-title>. <source>Phys Rev Lett</source>. (<year>2007</year>) <volume>98</volume>:<fpage>230601</fpage>. <pub-id pub-id-type="doi">10.1103/PhysRevLett.98.230601</pub-id><pub-id pub-id-type="pmid">17677891</pub-id></citation></ref>
<ref id="B20">
<label>20.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Longtin</surname> <given-names>A</given-names></name> <name><surname>Schimansky-Geier</surname> <given-names>L</given-names></name></person-group>. <article-title>Additive noise-induced Turing transitions in spatial systems with application to neural fields and the Swift-Hohenberg equation</article-title>. <source>Phys D</source>. (<year>2008</year>) <volume>237</volume>:<fpage>755</fpage>&#x02013;<lpage>73</lpage>. <pub-id pub-id-type="doi">10.1016/j.physd.2007.10.013</pub-id></citation></ref>
<ref id="B21">
<label>21.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name></person-group>. <article-title>Additive noise may change the stability of nonlinear systems</article-title>. <source>Europhys Lett</source>. (<year>2008</year>) <volume>84</volume>:<fpage>34003</fpage>. <pub-id pub-id-type="doi">10.1209/0295-5075/84/34003</pub-id><pub-id pub-id-type="pmid">16605630</pub-id></citation></ref>
<ref id="B22">
<label>22.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pradas</surname> <given-names>M</given-names></name> <name><surname>Tseluiko</surname> <given-names>D</given-names></name> <name><surname>Kalliadasis</surname> <given-names>S</given-names></name> <name><surname>Papageorgiou</surname> <given-names>DT</given-names></name> <name><surname>Pavliotis</surname> <given-names>GA</given-names></name></person-group>. <article-title>Noise induced state transitions, intermittency and universality in the noisy Kuramoto-Sivashinsky equation</article-title>. <source>Phys Rev Lett</source>. (<year>2011</year>) <volume>106</volume>:<fpage>060602</fpage>. <pub-id pub-id-type="doi">10.1103/PhysRevLett.106.060602</pub-id><pub-id pub-id-type="pmid">21405452</pub-id></citation></ref>
<ref id="B23">
<label>23.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bloemker</surname> <given-names>D</given-names></name> <name><surname>Hairer</surname> <given-names>M</given-names></name> <name><surname>Pavliotis</surname> <given-names>GA</given-names></name></person-group>. <article-title>Modulation equations: stochastic bifurcation in large domains</article-title>. <source>Commun Math Phys</source>. (<year>2005</year>) <volume>258</volume>:<fpage>479</fpage>&#x02013;<lpage>512</lpage>. <pub-id pub-id-type="doi">10.1007/s00220-005-1368-8</pub-id></citation></ref>
<ref id="B24">
<label>24.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Guillouzic</surname> <given-names>S</given-names></name> <name><surname>L&#x00027;Heureux</surname> <given-names>I</given-names></name> <name><surname>Longtin</surname> <given-names>A</given-names></name></person-group>. <article-title>Small delay approximation of stochastic delay differential equation</article-title>. <source>Phys Rev E</source>. (<year>1999</year>) <volume>59</volume>:<fpage>3970</fpage>. <pub-id pub-id-type="doi">10.1103/PhysRevE.59.3970</pub-id><pub-id pub-id-type="pmid">11031533</pub-id></citation></ref>
<ref id="B25">
<label>25.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lefebvre</surname> <given-names>J</given-names></name> <name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>LeBlanc</surname> <given-names>VG</given-names></name> <name><surname>Longtin</surname> <given-names>A</given-names></name></person-group>. <article-title>Reduced dynamics for delayed systems with harmonic or stochastic forcing</article-title>. <source>Chaos</source>. (<year>2012</year>) <volume>22</volume>:<fpage>043121</fpage>. <pub-id pub-id-type="doi">10.1063/1.4760250</pub-id><pub-id pub-id-type="pmid">23278056</pub-id></citation></ref>
<ref id="B26">
<label>26.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name> <name><surname>Longtin</surname> <given-names>A</given-names></name></person-group>. <article-title>Delay stabilizes stochastic systems near a non-oscillatory instability</article-title>. <source>Europhys Lett</source>. (<year>2012</year>) <volume>98</volume>:<fpage>20004</fpage>. <pub-id pub-id-type="doi">10.1209/0295-5075/98/20004</pub-id></citation></ref>
<ref id="B27">
<label>27.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name></person-group>. <article-title>Stochastic center manifold analysis in scalar nonlinear systems involving distributed delays and additive noise</article-title>. <source>Markov Proc Rel Fields</source>. (<year>2016</year>) <volume>22</volume>:<fpage>555</fpage>&#x02013;<lpage>72</lpage>.</citation></ref>
<ref id="B28">
<label>28.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>McDonnell</surname> <given-names>MD</given-names></name> <name><surname>Ward</surname> <given-names>LM</given-names></name></person-group>. <article-title>The benefits of noise in neural systems: bridging theory and experiment</article-title>. <source>Nat Rev Neurosci</source>. (<year>2011</year>) <volume>12</volume>:<fpage>415</fpage>&#x02013;<lpage>26</lpage>. <pub-id pub-id-type="doi">10.1038/nrn3061</pub-id><pub-id pub-id-type="pmid">21685932</pub-id></citation></ref>
<ref id="B29">
<label>29.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eggert</surname> <given-names>T</given-names></name> <name><surname>Henriques</surname> <given-names>DYP</given-names></name> <name><surname>Hart</surname> <given-names>BM</given-names></name> <name><surname>Straube</surname> <given-names>A</given-names></name></person-group>. <article-title>Modeling inter-trial variability of pointing movements during visuomotor adaptation</article-title>. <source>Biol Cybern</source>. (<year>2021</year>) <volume>115</volume>:<fpage>59</fpage>&#x02013;<lpage>86</lpage>. <pub-id pub-id-type="doi">10.1007/s00422-021-00858-w</pub-id><pub-id pub-id-type="pmid">33575896</pub-id></citation></ref>
<ref id="B30">
<label>30.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fox</surname> <given-names>MD</given-names></name> <name><surname>Snyder</surname> <given-names>AZ</given-names></name> <name><surname>Vincent</surname> <given-names>JL</given-names></name> <name><surname>Raichle</surname> <given-names>ME</given-names></name></person-group>. <article-title>Intrinsic fluctuations within cortical systems account for intertrial variability in human behavior</article-title>. <source>Neuron</source>. (<year>2007</year>) <volume>56</volume>:<fpage>171</fpage>&#x02013;<lpage>84</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuron.2007.08.023</pub-id><pub-id pub-id-type="pmid">17920023</pub-id></citation></ref>
<ref id="B31">
<label>31.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wolff</surname> <given-names>A</given-names></name> <name><surname>Chen</surname> <given-names>L</given-names></name> <name><surname>Tumati</surname> <given-names>S</given-names></name> <name><surname>Golesorkhi</surname> <given-names>M</given-names></name> <name><surname>Gomez-Pilar</surname> <given-names>J</given-names></name> <name><surname>Hu</surname> <given-names>J</given-names></name> <etal/></person-group>. <article-title>Prestimulus dynamics blend with the stimulus in neural variability quenching</article-title>. <source>NeuroImage</source>. (<year>2021</year>) <volume>238</volume>:<fpage>118160</fpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2021.118160</pub-id><pub-id pub-id-type="pmid">34058331</pub-id></citation></ref>
<ref id="B32">
<label>32.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sadaghiani</surname> <given-names>S</given-names></name> <name><surname>Hesselmann</surname> <given-names>G</given-names></name> <name><surname>Friston</surname> <given-names>KJ</given-names></name> <name><surname>Kleinschmidt</surname> <given-names>A</given-names></name></person-group>. <article-title>The relation of ongoing brain activity, evoked neural responses, and cognition</article-title>. <source>Front Syst Neurosci</source>. (<year>2010</year>) <volume>4</volume>:<fpage>20</fpage>. <pub-id pub-id-type="doi">10.3389/fnsys.2010.00020</pub-id><pub-id pub-id-type="pmid">20631840</pub-id></citation></ref>
<ref id="B33">
<label>33.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Krishnan</surname> <given-names>GP</given-names></name> <name><surname>Gonzalez</surname> <given-names>OC</given-names></name> <name><surname>Bazhenov</surname> <given-names>M</given-names></name></person-group>. <article-title>Origin of slow spontaneous resting-state neuronal fluctuations in brain networks</article-title>. <source>Proc Natl Acad Sci USA</source>. (<year>2018</year>) <volume>115</volume>:<fpage>6858</fpage>&#x02013;<lpage>63</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1715841115</pub-id><pub-id pub-id-type="pmid">29884650</pub-id></citation></ref>
<ref id="B34">
<label>34.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chew</surname> <given-names>B</given-names></name> <name><surname>Hauser</surname> <given-names>TU</given-names></name> <name><surname>Papoutsi</surname> <given-names>M</given-names></name> <name><surname>Magerkurth</surname> <given-names>J</given-names></name> <name><surname>Dolan</surname> <given-names>RJ</given-names></name> <name><surname>Rutledge</surname> <given-names>RB</given-names></name></person-group>. <article-title>Endogenous fluctuations in the dopaminergic midbrain drive behavioral choice variability</article-title>. <source>Proc Natl Acad Sci USA</source>. (<year>2019</year>) <volume>116</volume>:<fpage>18732</fpage>&#x02013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1900872116</pub-id><pub-id pub-id-type="pmid">31451671</pub-id></citation></ref>
<ref id="B35">
<label>35.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stringer</surname> <given-names>C</given-names></name> <name><surname>Pachitariu</surname> <given-names>M</given-names></name> <name><surname>Steinmetz</surname> <given-names>NA</given-names></name> <name><surname>Okun</surname> <given-names>M</given-names></name> <name><surname>Bartho</surname> <given-names>P</given-names></name> <name><surname>Harris</surname> <given-names>KD</given-names></name> <etal/></person-group>. <article-title>Inhibitory control of correlated intrinsic variability in cortical networks</article-title>. <source>eLife</source>. (<year>2016</year>) <volume>5</volume>:<fpage>e19695</fpage>. <pub-id pub-id-type="doi">10.7554/eLife.19695</pub-id><pub-id pub-id-type="pmid">27926356</pub-id></citation></ref>
<ref id="B36">
<label>36.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sancristobal</surname> <given-names>B</given-names></name> <name><surname>Ferri</surname> <given-names>F</given-names></name> <name><surname>Perucci</surname> <given-names>MG</given-names></name> <name><surname>Romani</surname> <given-names>GL</given-names></name> <name><surname>Longtin</surname> <given-names>A</given-names></name> <name><surname>Northoff</surname> <given-names>G</given-names></name></person-group>. <article-title>Slow resting state fluctuations enhance neuronal and behavioral responses to looming sounds</article-title>. <source>Brain Top</source>. (<year>2021</year>) <volume>35</volume>:<fpage>121</fpage>&#x02013;<lpage>41</lpage>. <pub-id pub-id-type="doi">10.1007/s10548-021-00826-4</pub-id><pub-id pub-id-type="pmid">33768383</pub-id></citation></ref>
<ref id="B37">
<label>37.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Orlandini</surname> <given-names>E</given-names></name></person-group>. <source>Multiplicative Noise</source>. Unpublished lecture notes on Physics in Complex Systems, University of Padova (<year>2012</year>).</citation></ref>
<ref id="B38">
<label>38.</label>
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Gardiner</surname> <given-names>CW</given-names></name></person-group>. <source>Handbook of Stochastic Methods</source>. <publisher-loc>Berlin</publisher-loc>: <publisher-name>Springer</publisher-name> (<year>2004</year>). <pub-id pub-id-type="doi">10.1007/978-3-662-05389-8</pub-id></citation></ref>
<ref id="B39">
<label>39.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sagues</surname> <given-names>F</given-names></name> <name><surname>Sancho</surname> <given-names>JM</given-names></name> <name><surname>Garcia-Ojalvo</surname> <given-names>J</given-names></name></person-group>. <article-title>Spatiotemporal order out of noise</article-title>. <source>Rev Mod Phys</source>. (<year>2007</year>) <volume>79</volume>:<fpage>829</fpage>&#x02013;<lpage>82</lpage>. <pub-id pub-id-type="doi">10.1103/RevModPhys.79.829</pub-id></citation></ref>
<ref id="B40">
<label>40.</label>
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Garcia-Ojalvo</surname> <given-names>J</given-names></name> <name><surname>Sancho</surname> <given-names>JM</given-names></name></person-group>. <source>Noise in Spatially Extended Systems</source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Springer</publisher-name> (<year>1999</year>). <pub-id pub-id-type="doi">10.1007/978-1-4612-1536-3</pub-id></citation></ref>
<ref id="B41">
<label>41.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name></person-group>. <article-title>Additive noise tunes the self-organization in complex systems</article-title>. In: <person-group person-group-type="editor"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Haken</surname> <given-names>H</given-names></name></person-group>, editors. <source>Synergetics</source>. Encyclopedia of Complexity and Systems Science Series. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Springer</publisher-name> (<year>2020</year>). p. <fpage>183</fpage>&#x02013;<lpage>96</lpage>. <pub-id pub-id-type="doi">10.1007/978-1-0716-0421-2_696</pub-id></citation></ref>
<ref id="B42">
<label>42.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Mierau</surname> <given-names>A</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name></person-group>. <article-title>Dynamic control of synchronous activity in networks of spiking neurons</article-title>. <source>PLoS ONE</source>. (<year>2016</year>) <volume>11</volume>:<fpage>e0161488</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0161488</pub-id><pub-id pub-id-type="pmid">27669018</pub-id></citation></ref>
<ref id="B43">
<label>43.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name> <name><surname>Hight</surname> <given-names>D</given-names></name> <name><surname>Sleigh</surname> <given-names>J</given-names></name></person-group>. <article-title>Suppression of underlying neuronal fluctuations mediates EEG slowing during general anaesthesia</article-title>. <source>Neuroimage</source>. (<year>2018</year>) <volume>179</volume>:<fpage>414</fpage>&#x02013;<lpage>28</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2018.06.043</pub-id><pub-id pub-id-type="pmid">29920378</pub-id></citation></ref>
<ref id="B44">
<label>44.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lefebvre</surname> <given-names>J</given-names></name> <name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Knebel</surname> <given-names>JF</given-names></name> <name><surname>Whittingstall</surname> <given-names>K</given-names></name> <name><surname>Murray</surname> <given-names>M</given-names></name></person-group>. <article-title>Stimulus statistics shape oscillations in nonlinear recurrent neural networks</article-title>. <source>J Neurosci</source>. (<year>2015</year>) <volume>35</volume>:<fpage>2895</fpage>&#x02013;<lpage>903</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.3609-14.2015</pub-id><pub-id pub-id-type="pmid">25698729</pub-id></citation></ref>
<ref id="B45">
<label>45.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lefebvre</surname> <given-names>J</given-names></name> <name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Frohlich</surname> <given-names>F</given-names></name></person-group>. <article-title>Stochastic resonance mediates the state-dependent effect of periodic stimulation on cortical alpha oscillations</article-title>. <source>eLife</source>. (<year>2017</year>) <volume>6</volume>:<fpage>e32054</fpage>. <pub-id pub-id-type="doi">10.7554/eLife.32054</pub-id><pub-id pub-id-type="pmid">29280733</pub-id></citation></ref>
<ref id="B46">
<label>46.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rich</surname> <given-names>S</given-names></name> <name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Skinner</surname> <given-names>FK</given-names></name> <name><surname>Valiante</surname> <given-names>TA</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name></person-group>. <article-title>Neurostimulation stabilizes spiking neural networks by disrupting seizure-like oscillatory transitions</article-title>. <source>Sci Rep</source>. (<year>2020</year>) <volume>10</volume>:<fpage>15408</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-020-72335-6</pub-id><pub-id pub-id-type="pmid">32958802</pub-id></citation></ref>
<ref id="B47">
<label>47.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Herrmann</surname> <given-names>CS</given-names></name> <name><surname>Murray</surname> <given-names>MM</given-names></name> <name><surname>Ionta</surname> <given-names>S</given-names></name> <name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name></person-group>. <article-title>Shaping intrinsic neural oscillations with periodic stimulation</article-title>. <source>J Neurosci</source>. (<year>2016</year>) <volume>36</volume>:<fpage>5328</fpage>&#x02013;<lpage>39</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.0236-16.2016</pub-id><pub-id pub-id-type="pmid">27170129</pub-id></citation></ref>
<ref id="B48">
<label>48.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Powanwe</surname> <given-names>AS</given-names></name> <name><surname>Longtin</surname> <given-names>A</given-names></name></person-group>. <article-title>Determinants of brain rhythm burst statistics</article-title>. <source>Sci Rep</source>. (<year>2019</year>) <volume>9</volume>:<fpage>18335</fpage>. <pub-id pub-id-type="doi">10.1038/s41598-019-54444-z</pub-id><pub-id pub-id-type="pmid">31797877</pub-id></citation></ref>
<ref id="B49">
<label>49.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Powanwe</surname> <given-names>AS</given-names></name> <name><surname>Longtin</surname> <given-names>A</given-names></name></person-group>. <article-title>Mechanisms of flexible information sharing through noisy oscillations</article-title>. <source>Biology</source>. (<year>2021</year>) <volume>10</volume>:<fpage>764</fpage>. <pub-id pub-id-type="doi">10.3390/biology10080764</pub-id><pub-id pub-id-type="pmid">34439996</pub-id></citation></ref>
<ref id="B50">
<label>50.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name> <name><surname>Hight</surname> <given-names>D</given-names></name> <name><surname>Kaiser</surname> <given-names>H</given-names></name></person-group>. <article-title>Phase coherence induced by additive Gaussian and non-Gaussian noise in excitable networks with application to burst suppression-like brain signals</article-title>. <source>Front Appl Math Stat</source>. (<year>2020</year>) <volume>5</volume>:<fpage>69</fpage>. <pub-id pub-id-type="doi">10.3389/fams.2019.00069</pub-id></citation></ref>
<ref id="B51">
<label>51.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name></person-group>. <article-title>Arousal fluctuations govern oscillatory transitions between dominant &#x003B3; and &#x003B1; occipital activity during eyes open/closed conditions</article-title>. <source>Brain Topogr</source>. (<year>2021</year>) <volume>35</volume>:<fpage>108</fpage>&#x02013;<lpage>20</lpage>. <pub-id pub-id-type="doi">10.1007/s10548-021-00855-z</pub-id><pub-id pub-id-type="pmid">34160731</pub-id></citation></ref>
<ref id="B52">
<label>52.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Wahl</surname> <given-names>T</given-names></name></person-group>. <article-title>Poisson-distributed noise induces cortical &#x003B3;-activity: explanation of &#x003B3;-enhancement by anaesthetics ketamine and propofol</article-title>. <source>J Phys Complex</source>. (<year>2022</year>) <volume>3</volume>:<fpage>015002</fpage>. <pub-id pub-id-type="doi">10.1088/2632-072X/ac4004</pub-id></citation></ref>
<ref id="B53">
<label>53.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Boland</surname> <given-names>RB</given-names></name> <name><surname>Galla</surname> <given-names>T</given-names></name> <name><surname>McKane</surname> <given-names>AJ</given-names></name></person-group>. <article-title>How limit cycles and quasi-cycles are related in systems with intrinsic noise</article-title>. <source>J Stat Mech</source>. (<year>2008</year>) <volume>2008</volume>:<fpage>P09001</fpage>. <pub-id pub-id-type="doi">10.1088/1742-5468/2008/09/P09001</pub-id></citation></ref>
<ref id="B54">
<label>54.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Greenwood</surname> <given-names>PE</given-names></name> <name><surname>Ward</surname> <given-names>LM</given-names></name></person-group>. <article-title>Rapidly forming, slowly evolving, spatial patterns from quasi-cycle Mexican Hat coupling</article-title>. <source>Math Biosci Eng</source>. (<year>2019</year>) <volume>16</volume>:<fpage>6769</fpage>&#x02013;<lpage>93</lpage>. <pub-id pub-id-type="doi">10.3934/mbe.2019338</pub-id><pub-id pub-id-type="pmid">31698587</pub-id></citation></ref>
<ref id="B55">
<label>55.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Powanwe</surname> <given-names>AS</given-names></name> <name><surname>Longtin</surname> <given-names>A</given-names></name></person-group>. <article-title>Phase dynamics of delay-coupled quasi-cycles with application to brain rhythms</article-title>. <source>Phys Rev Res</source>. (<year>2020</year>) <volume>2</volume>:<fpage>043067</fpage>. <pub-id pub-id-type="doi">10.1103/PhysRevResearch.2.043067</pub-id></citation></ref>
<ref id="B56">
<label>56.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hashemi</surname> <given-names>M</given-names></name> <name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Sleigh</surname> <given-names>J</given-names></name></person-group>. <article-title>How the cortico-thalamic feedback affects the EEG power spectrum over frontal and occipital regions during propofol-induced anaesthetic sedation</article-title>. <source>J Comput Neurosci</source>. (<year>2015</year>) <volume>39</volume>:<fpage>155</fpage>&#x02013;<lpage>79</lpage>. <pub-id pub-id-type="doi">10.1007/s10827-015-0569-1</pub-id><pub-id pub-id-type="pmid">26256583</pub-id></citation></ref>
<ref id="B57">
<label>57.</label>
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Sleigh</surname> <given-names>JW</given-names></name> <name><surname>Voss</surname> <given-names>L</given-names></name> <name><surname>Steyn-Ross</surname> <given-names>ML</given-names></name> <name><surname>Steyn-Ross</surname> <given-names>DA</given-names></name> <name><surname>Wilson</surname> <given-names>MT</given-names></name></person-group>. <article-title>Modelling sleep and general anaesthesia</article-title>. In: <person-group person-group-type="editor"><name><surname>Hutt</surname> <given-names>A</given-names></name></person-group>, editor. <source>Sleep and Anesthesia: Neural correlates in Theory and Experiment</source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Springer</publisher-name> (<year>2011</year>). p. <fpage>21</fpage>&#x02013;<lpage>41</lpage>. <pub-id pub-id-type="doi">10.1007/978-1-4614-0173-5_2</pub-id></citation></ref>
<ref id="B58">
<label>58.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Purdon</surname> <given-names>PL</given-names></name> <name><surname>Pierce</surname> <given-names>ET</given-names></name> <name><surname>Mukamel</surname> <given-names>EA</given-names></name> <name><surname>Prerau</surname> <given-names>MJ</given-names></name> <name><surname>Walsh</surname> <given-names>JL</given-names></name> <name><surname>Wong</surname> <given-names>KF</given-names></name> <etal/></person-group>. <article-title>Electroencephalogram signatures of loss and recovery of consciousness from propofol</article-title>. <source>Proc Natl Acad Sci USA</source>. (<year>2013</year>) <volume>110</volume>:<fpage>E1142</fpage>&#x02013;<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1221180110</pub-id><pub-id pub-id-type="pmid">23487781</pub-id></citation></ref>
<ref id="B59">
<label>59.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Erd&#x000F6;s</surname> <given-names>P</given-names></name> <name><surname>R&#x000E9;nyi</surname> <given-names>A</given-names></name></person-group>. <article-title>On random graphs. I</article-title>. <source>Publ Math</source>. (<year>1959</year>) <volume>6</volume>:<fpage>209</fpage>&#x02013;<lpage>97</lpage>.</citation></ref>
<ref id="B60">
<label>60.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Wahl</surname> <given-names>T</given-names></name> <name><surname>Voges</surname> <given-names>N</given-names></name> <name><surname>Hausmann</surname> <given-names>J</given-names></name> <name><surname>Lefebvre</surname> <given-names>J</given-names></name></person-group>. <article-title>Coherence resonance in random Erdos-Renyi neural networks: mean-field theory</article-title>. <source>Front Appl Math Stat</source>. (<year>2021</year>) <volume>7</volume>:<fpage>697904</fpage>. <pub-id pub-id-type="doi">10.3389/fams.2021.697904</pub-id></citation></ref>
<ref id="B61">
<label>61.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>F&#x000FC;redi</surname> <given-names>Z</given-names></name> <name><surname>Komlos</surname> <given-names>J</given-names></name></person-group>. <article-title>The eigenvalues of random symmetric matrices</article-title>. <source>Combinatorica</source>. (<year>1981</year>) <volume>1</volume>:<fpage>233</fpage>&#x02013;<lpage>41</lpage>. <pub-id pub-id-type="doi">10.1007/BF02579329</pub-id></citation></ref>
<ref id="B62">
<label>62.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yan</surname> <given-names>G</given-names></name> <name><surname>Martinez</surname> <given-names>ND</given-names></name> <name><surname>Liu</surname> <given-names>YY</given-names></name></person-group>. <article-title>Degree heterogeneity and stability of ecological networks</article-title>. <source>J R Soc Interface</source>. (<year>2017</year>) <volume>14</volume>:<fpage>20170189</fpage>. <pub-id pub-id-type="doi">10.1098/rsif.2017.0189</pub-id><pub-id pub-id-type="pmid">28637917</pub-id></citation></ref>
<ref id="B63">
<label>63.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Amaral</surname> <given-names>LAN</given-names></name> <name><surname>Scala</surname> <given-names>A</given-names></name> <name><surname>Barth&#x000E9;l&#x000E9;my</surname> <given-names>M</given-names></name> <name><surname>Stanley</surname> <given-names>HE</given-names></name></person-group>. <article-title>Classes of small-world networks</article-title>. <source>Proc Natl Acad Sci USA</source>. (<year>2000</year>) <volume>97</volume>:<fpage>11149</fpage>&#x02013;<lpage>52</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.200327197</pub-id><pub-id pub-id-type="pmid">11005838</pub-id></citation></ref>
<ref id="B64">
<label>64.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Watts</surname> <given-names>DJ</given-names></name> <name><surname>Strogatz</surname> <given-names>SH</given-names></name></person-group>. <article-title>Collective dynamics of &#x00027;small-world&#x00027; networks</article-title>. <source>Nature</source>. (<year>1998</year>) <volume>393</volume>:<fpage>440</fpage>&#x02013;<lpage>2</lpage>. <pub-id pub-id-type="doi">10.1038/30918</pub-id><pub-id pub-id-type="pmid">9623998</pub-id></citation></ref>
<ref id="B65">
<label>65.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dada</surname> <given-names>JO</given-names></name> <name><surname>Mendes</surname> <given-names>P</given-names></name></person-group>. <article-title>Multi-scale modelling and simulation in systems biology</article-title>. <source>Integr Biol</source>. (<year>2011</year>) <volume>3</volume>:<fpage>86</fpage>&#x02013;<lpage>96</lpage>. <pub-id pub-id-type="doi">10.1039/c0ib00075b</pub-id><pub-id pub-id-type="pmid">21212881</pub-id></citation></ref>
<ref id="B66">
<label>66.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ghosal</surname> <given-names>S</given-names></name> <name><surname>Moin</surname> <given-names>P</given-names></name></person-group>. <article-title>The basic equations for the large eddy simulation of turbulent flows in complex geometry</article-title>. <source>J Comput Phys</source>. (<year>1995</year>) <volume>118</volume>:<fpage>24</fpage>&#x02013;<lpage>37</lpage>. <pub-id pub-id-type="doi">10.1006/jcph.1995.1077</pub-id></citation></ref>
<ref id="B67">
<label>67.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Germano</surname> <given-names>M</given-names></name> <name><surname>Piomelli</surname> <given-names>U</given-names></name> <name><surname>Moin</surname> <given-names>P</given-names></name> <name><surname>Cabot</surname> <given-names>WH</given-names></name></person-group>. <article-title>A dynamic subgrid-scale eddy viscosity model</article-title>. <source>Phys Fluids A</source>. (<year>1991</year>) <volume>3</volume>:<fpage>1760</fpage>&#x02013;<lpage>5</lpage>. <pub-id pub-id-type="doi">10.1063/1.857955</pub-id></citation></ref>
<ref id="B68">
<label>68.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Freitas</surname> <given-names>D</given-names></name> <name><surname>Guerreiro Lopes</surname> <given-names>L</given-names></name> <name><surname>Morgado-Dias</surname> <given-names>F</given-names></name></person-group>. <article-title>Particle swarm optimisation: a historical review up to the current developments</article-title>. <source>Entropy</source>. (<year>2020</year>) <volume>22</volume>:<fpage>362</fpage>. <pub-id pub-id-type="doi">10.3390/e22030362</pub-id><pub-id pub-id-type="pmid">33286136</pub-id></citation></ref>
<ref id="B69">
<label>69.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hashemi</surname> <given-names>M</given-names></name> <name><surname>Hutt</surname> <given-names>A</given-names></name> <name><surname>Buhry</surname> <given-names>L</given-names></name> <name><surname>Sleigh</surname> <given-names>JW</given-names></name></person-group>. <article-title>Optimal model parameter fit to EEG power spectrum features observed during general anesthesia</article-title>. <source>Neuroinformatics</source>. (<year>2018</year>) <volume>16</volume>:<fpage>231</fpage>&#x02013;<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1007/s12021-018-9369-x</pub-id><pub-id pub-id-type="pmid">29516302</pub-id></citation></ref>
<ref id="B70">
<label>70.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lefebvre</surname> <given-names>J</given-names></name> <name><surname>Hutt</surname> <given-names>A</given-names></name></person-group>. <article-title>Additive noise quenches delay-induced oscillations</article-title>. <source>Europhys Lett</source>. (<year>2013</year>) <volume>102</volume>:<fpage>60003</fpage>. <pub-id pub-id-type="doi">10.1209/0295-5075/102/60003</pub-id></citation></ref>
<ref id="B71">
<label>71.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yuval</surname> <given-names>J</given-names></name> <name><surname>O&#x00027;Gorman</surname> <given-names>PA</given-names></name></person-group>. <article-title>Stable machine-learning parameterization of subgrid processes for climate modeling at a range of resolutions</article-title>. <source>Nat Commun</source>. (<year>2020</year>) <volume>11</volume>:<fpage>3295</fpage>. <pub-id pub-id-type="doi">10.1038/s41467-020-17142-3</pub-id><pub-id pub-id-type="pmid">32620769</pub-id></citation></ref>
</ref-list> 
</back>
</article>