Electrical and chemical synapses shape the dynamics of neural networks, and their functional roles in information processing have been a longstanding question in neurobiology. In this paper, we investigate the role of synapses in the optimization of self-induced stochastic resonance in a delayed multiplex neural network, using analytical and numerical methods. We consider a two-layer multiplex network in which, at the intra-layer level, neurons are coupled either by electrical synapses or by inhibitory chemical synapses. For each isolated layer, computations indicate that weaker electrical and chemical synaptic couplings are better optimizers of self-induced stochastic resonance. In addition, regardless of the synaptic strengths, shorter electrical synaptic delays are better optimizers of the phenomenon than shorter chemical synaptic delays, while longer chemical synaptic delays are better optimizers than longer electrical synaptic delays; in both cases, the poorer optimizers are in fact the worst. It is found that electrical, inhibitory, or excitatory chemical multiplexing of two layers having only electrical synapses at the intra-layer level can each optimize the phenomenon. By contrast, only excitatory chemical multiplexing of two layers having only inhibitory chemical synapses at the intra-layer level can optimize the phenomenon. These results may guide experiments aimed at establishing or confirming the mechanism of self-induced stochastic resonance in networks of artificial neural circuits as well as in real biological neural networks.
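Self-induced stochastic resonance is typically studied in slow-fast excitable models such as the FitzHugh-Nagumo neuron, where weak noise acting on the fast variable can evoke coherent spiking even though the deterministic system rests at a fixed point. The paper's delayed multiplex network is not reproduced here; the following is a minimal single-neuron sketch, with all parameter values being illustrative assumptions rather than values taken from the paper:

```python
import numpy as np

def simulate_fhn(sigma=0.008, eps=0.05, a=1.05, T=100.0, dt=0.001, seed=0):
    """Euler-Maruyama integration of a noisy FitzHugh-Nagumo neuron in the
    excitable regime (a > 1): the noise-free system sits at a stable fixed
    point, and weak noise in the fast variable v can trigger excursions.
    All parameters here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    v, w = -a, -a + a ** 3 / 3          # stable fixed point of the noise-free system
    xi = rng.standard_normal(n)          # pre-drawn Gaussian increments
    vs = np.empty(n)
    for i in range(n):
        # fast variable: eps * dv = (v - v^3/3 - w) dt + sigma dW
        dv = (v - v ** 3 / 3 - w) * dt / eps + (sigma / eps) * np.sqrt(dt) * xi[i]
        # slow recovery variable: dw = (v + a) dt
        dw = (v + a) * dt
        v, w = v + dv, w + dw
        vs[i] = v
    return vs
```

In the vanishing-noise limit this setup produces the nearly deterministic interspike intervals characteristic of self-induced stochastic resonance; the network version in the paper adds delayed electrical and chemical coupling terms on top of this single-neuron skeleton.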
All vertebrate brains contain a dense matrix of thin fibers that release serotonin (5-hydroxytryptamine), a neurotransmitter that modulates a wide range of neural, glial, and vascular processes. Perturbations in the density of this matrix have been associated with a number of mental disorders, including autism and depression, but its self-organization and plasticity remain poorly understood. We introduce a model based on reflected fractional Brownian motion (FBM), a rigorously defined stochastic process, and show that it recapitulates some key features of regional serotonergic fiber densities. Specifically, we use supercomputing simulations to model fibers as FBM paths in two-dimensional brain-like domains and demonstrate that the resultant steady-state distributions approximate the fiber distributions in physical brain sections immunostained for the serotonin transporter (a marker for serotonergic axons in the adult brain). We suggest that this framework can support predictive descriptions and manipulations of the serotonergic matrix and that it can be further extended to incorporate the detailed physical properties of the fibers and their environment.
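As a rough illustration of the modeling idea (not the paper's supercomputing pipeline), a reflected FBM path in a bounded two-dimensional domain can be generated by sampling exact fractional Gaussian noise via a Cholesky factorization of its autocovariance and folding the cumulative path back into the domain by reflection. The domain, Hurst exponent, and step scale below are all illustrative assumptions:

```python
import numpy as np

def fgn(n, hurst, rng):
    """Exact fractional Gaussian noise via Cholesky factorization of its
    autocovariance (O(n^3): fine for short illustrative paths)."""
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1) ** (2 * hurst)
                   - 2 * np.abs(k) ** (2 * hurst)
                   + np.abs(k - 1) ** (2 * hurst))
    cov = gamma[np.abs(k[:, None] - k[None, :])]
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

def reflect(x, lo, hi):
    """Fold coordinates back into [lo, hi] by repeated reflection at the walls."""
    period = 2 * (hi - lo)
    y = np.mod(x - lo, period)
    return lo + np.minimum(y, period - y)

def reflected_fbm_2d(n=400, hurst=0.8, step=0.02, seed=1):
    """One FBM path per coordinate, started mid-domain and reflected into
    the unit square (a crude stand-in for a brain-like domain)."""
    rng = np.random.default_rng(seed)
    incs = np.stack([fgn(n, hurst, rng), fgn(n, hurst, rng)], axis=1)
    path = np.cumsum(incs * step, axis=0)
    return reflect(path + 0.5, 0.0, 1.0)
```

Accumulating visit counts of many such paths over a spatial grid yields the steady-state density fields that the paper compares against immunostained sections; superdiffusive persistence (Hurst exponent above 0.5) is what distinguishes these paths from ordinary Brownian motion.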
The ventral visual stream (VVS) is a fundamental pathway involved in visual object identification and recognition. In this work, we present a hypothesis about the sequence of computations performed by the VVS during object recognition. The operations performed by the inferior temporal (IT) cortex are represented not as a neural network, but rather as a dynamic-inference instantiation of the untangling notion. The presentation draws upon a technique for dynamic maximum a posteriori probability (MAP) sequence estimation based on the Viterbi algorithm. Simulation results show that the decoding portion of the architecture associated with the IT can effectively untangle object identity when presented with synthetic data. More importantly, we take a step forward in visual neuroscience by presenting a biologically inspired, inference-based framework built on attributes implicated in primate object recognition. The analysis provides insight into the exceptional proficiency of the VVS.
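The Viterbi algorithm underlying the MAP sequence estimation can be sketched in its standard hidden-Markov-model form; this is a generic textbook implementation, not the paper's VVS-specific decoder:

```python
import numpy as np

def viterbi(log_pi, log_A, log_B, obs):
    """MAP state-sequence estimate for an HMM via the Viterbi recursion.
    log_pi: (S,) log initial probabilities; log_A: (S, S) log transition
    matrix; log_B: (S, O) log emission matrix; obs: observation indices."""
    T, S = len(obs), len(log_pi)
    delta = log_pi + log_B[:, obs[0]]        # best log-score ending in each state
    back = np.zeros((T, S), dtype=int)       # backpointers
    for t in range(1, T):
        scores = delta[:, None] + log_A      # scores[i, j]: best path ending i -> j
        back[t] = np.argmax(scores, axis=0)
        delta = scores[back[t], np.arange(S)] + log_B[:, obs[t]]
    path = np.empty(T, dtype=int)
    path[-1] = np.argmax(delta)
    for t in range(T - 1, 0, -1):            # trace backpointers to recover the MAP path
        path[t - 1] = back[t, path[t]]
    return path
```

In the paper's framing, the hidden states would correspond to object identities and the observations to the tangled population responses arriving at the IT; the recursion above is the generic machinery for recovering the most probable identity sequence.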
Resting-state brain activities have been extensively investigated to understand the macro-scale network architecture of the human brain using non-invasive imaging methods such as fMRI, EEG, and MEG. Previous studies revealed a mechanistic origin of resting-state networks (RSNs) using the connectome dynamics modeling approach, in which a neural mass dynamics model constrained by the structural connectivity is simulated to replicate the resting-state networks measured with fMRI and/or the fast synchronization transitions measured with EEG/MEG. However, the relationship between the slow fluctuations measured with fMRI and the fast synchronization transitions measured with EEG/MEG remains poorly understood. In this study, as a first step toward evaluating experimental evidence of resting-state activity at two different time scales in a unified way, we investigate connectome dynamics models that simultaneously explain resting-state functional connectivity (rsFC) and EEG microstates. Here, we introduce empirical rsFC and microstates as evaluation criteria for simulated neuronal dynamics obtained with the Larter-Breakspear model in each cortical region, connected to the other cortical regions according to the structural connectivity. We optimized the global coupling strength and the local gain parameter (the variance of the excitatory and inhibitory thresholds) of the simulated neuronal dynamics by fitting both the rsFC and the microstate spatial patterns to the experimental ones. As a result, we found that simulated neuronal dynamics in a narrow optimal parameter range simultaneously reproduced the empirical rsFC and microstates. Two parameter groups showed different inter-regional interdependence: in one, dynamics were synchronized across the whole brain, and in the other, dynamics were synchronized between brain regions with strong structural connectivity.
In other words, in both parameter groups, the fast synchronization transitions and the slow BOLD fluctuations changed according to the structural connectivity. The empirical microstates were similar to the simulated microstates in both parameter groups. Thus, fast synchronization transitions, correlated with slow BOLD fluctuations through the structural connectivity, yielded the characteristics of the microstates. Our results demonstrate that a bottom-up approach, which extends a single-neuron dynamics model grounded in empirical observations into a neural mass dynamics model and integrates structural connectivity, effectively reveals both the fast and the slow macroscopic resting-state network dynamics.
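The fitting loop described above (simulate the dynamics, compute functional connectivity, score it against the empirical pattern, and search the parameter grid) can be caricatured with a toy linear surrogate in place of the Larter-Breakspear model. Everything below, including the surrogate dynamics, the similarity score, and the grid, is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(42)
n_regions = 8
W = np.abs(rng.standard_normal((n_regions, n_regions)))  # toy structural connectivity
W = (W + W.T) / 2
np.fill_diagonal(W, 0)

def simulate(g, T=1000, seed=7):
    """Toy linear surrogate for the neural mass dynamics: each region mixes
    private noise with structurally weighted input from the others,
    scaled by the global coupling strength g."""
    noise = np.random.default_rng(seed).standard_normal((n_regions, T))
    return noise + g * (W @ noise)

def fc(x):
    """Functional connectivity as the inter-regional correlation matrix."""
    return np.corrcoef(x)

def fc_similarity(a, b):
    """Pearson correlation between the upper triangles of two FC matrices,
    a common goodness-of-fit score when tuning connectome models."""
    iu = np.triu_indices_from(a, k=1)
    return np.corrcoef(a[iu], b[iu])[0, 1]

fc_emp = fc(simulate(0.5))               # stand-in for the empirical rsFC
grid = np.linspace(0.0, 1.0, 11)         # one-parameter stand-in for the 2-D search
best = grid[np.argmax([fc_similarity(fc(simulate(g)), fc_emp) for g in grid])]
```

The actual study extends this pattern in two ways: the dynamics come from the Larter-Breakspear neural mass model, and the objective jointly scores the slow rsFC pattern and the fast EEG microstate maps over a two-dimensional grid of coupling strength and local gain.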