Systems of coupled dynamical units (e.g., oscillators or neurons) are known to exhibit complex, emergent behaviors that may be simplified through coarse-graining: a process in which one discovers coarse variables and derives equations for their evolution. Such coarse-graining procedures often require extensive experience and/or a deep understanding of the system dynamics. In this paper we present a systematic, data-driven approach to discovering “bespoke” coarse variables based on manifold learning algorithms. We illustrate this methodology with the classic Kuramoto phase oscillator model, and demonstrate how our manifold learning technique can successfully identify a coarse variable that is one-to-one with the established Kuramoto order parameter. We then introduce an extension of our coarse-graining methodology which enables us to learn evolution equations for the discovered coarse variables via an artificial neural network architecture templated on numerical time integrators (initial value solvers). This approach allows us to learn accurate approximations of time derivatives of state variables from sparse flow data, and hence discover useful approximate differential equation descriptions of their dynamic behavior. We demonstrate this capability by learning ODEs that agree with the known analytical expression for the Kuramoto order parameter dynamics in the continuum limit. We then show how this approach can also be used to learn the dynamics of coarse variables discovered through our manifold learning methodology. In both of these examples, we compare the results of our neural-network-based method to typical finite differences complemented with geometric harmonics. Finally, we present a series of computational examples illustrating how a variation of our manifold learning methodology can be used to discover sets of “effective” parameters, reduced parameter combinations, for multi-parameter models with complex coupling.
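To make the setting concrete, the Kuramoto model and its order parameter can be sketched as follows. This is a minimal illustration, not the paper's implementation: the coupling strength, frequency spread, ensemble size, and the forward-Euler integration are all illustrative choices of ours.

```python
import numpy as np

def kuramoto_step(theta, omega, K, dt):
    """One forward-Euler step of the Kuramoto model
    dtheta_i/dt = omega_i + (K/N) * sum_j sin(theta_j - theta_i)."""
    N = theta.size
    coupling = (K / N) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
    return theta + dt * (omega + coupling)

def order_parameter(theta):
    """Magnitude and phase of the complex Kuramoto order
    parameter r * exp(i*psi) = (1/N) * sum_j exp(i*theta_j)."""
    z = np.mean(np.exp(1j * theta))
    return np.abs(z), np.angle(z)

# Illustrative parameters: narrow frequency spread and strong coupling,
# so the ensemble synchronizes and r grows toward 1.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2 * np.pi, 500)
omega = rng.normal(0.0, 0.1, 500)
r_start, _ = order_parameter(theta)
for _ in range(2000):
    theta = kuramoto_step(theta, omega, K=1.0, dt=0.01)
r_end, _ = order_parameter(theta)
```

Starting from uniformly scattered phases, r begins near zero (of order 1/sqrt(N)) and, for coupling above the synchronization threshold, approaches 1 as the oscillators lock; it is this scalar r that the learned coarse variable is shown to parametrize one-to-one.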
We conclude with a discussion of possible extensions of this approach, including the possibility of obtaining data-driven effective partial differential equations for coarse-grained neuronal network behavior, as illustrated by the synchronization dynamics of Hodgkin–Huxley type neurons on a Chung-Lu network. Thus, we build an integrated suite of tools for obtaining data-driven coarse variables, data-driven effective parameters, and data-driven coarse-grained equations from detailed observations of networks of oscillators.
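The idea of templating the learning problem on a numerical time integrator can be illustrated in miniature. In this sketch (our construction, not the paper's architecture) the right-hand side of du/dt = f(u) is taken to be linear, f(u) = a*u, so the integrator-templated least-squares problem has a closed form; the paper replaces f with a neural network trained on the same Euler-step residual.

```python
import numpy as np

# Hypothetical ground truth du/dt = a*u, with snapshot pairs (u_n, u_{n+1})
# sampled from the exact flow at a small step h ("sparse flow data").
rng = np.random.default_rng(1)
h = 0.01
true_a = -0.7
u0 = rng.uniform(0.5, 1.5, 200)
u1 = u0 * np.exp(true_a * h)

# Forward-Euler template: u_{n+1} ≈ u_n + h * f(u_n). With f(u) = a_hat * u,
# minimizing the residual sum((u1 - u0 - h*a_hat*u0)**2) over a_hat gives:
a_hat = np.sum((u1 - u0) * u0) / (h * np.sum(u0 * u0))
```

The recovered coefficient matches the true one up to the O(h) discretization error of the Euler template; higher-order integrator templates (e.g., Runge–Kutta) reduce this bias, which is the motivation for building the solver structure into the network.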
The resting state fMRI time series appears to have cyclic patterns, which indicates the presence of cyclic interactions between different brain regions. Such interactions are not easily captured by established resting state functional connectivity methods, including zero-lag correlation, lagged correlation, and dynamic time warping distance. These methods formulate the functional interaction between different brain regions as similar temporal patterns within the time series. To use information related to temporal ordering, cyclicity analysis has been introduced to capture pairwise interactions between multiple time series. In this study, we compared the efficacy of cyclicity analysis with the aforementioned similarity-based techniques in representing individual-level and group-level information. Additionally, we investigated how filtering and global signal regression interacted with these techniques. We obtained and analyzed fMRI data from patients with tinnitus and neurotypical controls on two days, one week apart. For both patient and control groups, we found that the features generated by cyclicity and correlation (zero-lag and lagged) analyses were more reliable than the features generated by dynamic time warping distance in identifying individuals across visits. The reliability of all features, except those generated by dynamic time warping, improved when the global signal was regressed out. Nevertheless, removing fluctuations above 0.1 Hz deteriorated the reliability of all features. These observations underscore the importance of choosing appropriate preprocessing steps when evaluating different analytical methods for describing resting state functional interactivity.
Further, using different machine learning techniques, including support vector machines, discriminant analyses, and convolutional neural networks, our results revealed that the manifestation of the group-level information within all features was not sufficient to dissociate tinnitus patients from controls with high sensitivity and specificity. This necessitates further investigation regarding the representation of group-level information within different features to better identify tinnitus-related alterations in the functional organization of the brain. Our study adds to the growing body of research on developing diagnostic tools to identify neurological disorders, such as tinnitus, using resting state fMRI data.
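The way cyclicity analysis captures temporal ordering, in contrast to the similarity-based measures, can be sketched with a common formulation of its pairwise "lead matrix": the signed (oriented) area swept out by each pair of time series, which is positive when one series tends to lead the other. This is a minimal sketch under our own discretization choices (left-endpoint differences), not the study's pipeline.

```python
import numpy as np

def lead_matrix(X):
    """Skew-symmetric lead matrix for cyclicity analysis.

    X has shape (n_series, n_timepoints). Entry A[i, j] approximates the
    oriented area 0.5 * integral(x_i dx_j - x_j dx_i): positive when
    series i tends to lead series j, negative when it lags."""
    dX = np.diff(X, axis=1)   # forward differences along time
    Xm = X[:, :-1]            # left endpoints
    return 0.5 * (Xm @ dX.T - dX @ Xm.T)

# Illustration: a sinusoid and a delayed copy over one full period.
# The undelayed series leads, so A[0, 1] > 0 (analytically, pi * sin(lag)).
t = np.linspace(0.0, 2 * np.pi, 2000)
X = np.vstack([np.sin(t), np.sin(t - 0.5)])
A = lead_matrix(X)
```

Zero-lag correlation cannot distinguish which of the two series leads, whereas the sign pattern of A encodes exactly that temporal ordering; features derived from such lead matrices are what the cyclicity-based reliability comparisons above are built on.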