Inferring the connectivity of coupled oscillators from time-series statistical similarity analysis

A system composed of interacting dynamical elements can be represented by a network, where the nodes represent the elements that constitute the system and the links account for their interactions, which arise through a variety of mechanisms and are often unknown. A popular method for inferring the system connectivity (i.e., the set of links among pairs of nodes) is to perform a statistical similarity analysis of the time series collected from the dynamics of the nodes. Here, by considering two systems of coupled oscillators (Kuramoto phase oscillators and Rössler chaotic electronic oscillators) with known and controllable coupling conditions, we test the performance of this inference method using linear and nonlinear statistical similarity measures. We find that, under adequate conditions, the network links can be perfectly inferred, i.e., no mistakes are made regarding the presence or absence of links. Perfect inference requires: i) an appropriate choice of the observed variable to be analysed, ii) an appropriate interaction strength, and iii) an adequate thresholding of the similarity matrix. For the dynamical units considered here we find that the linear statistical similarity measure performs, in general, better than the nonlinear ones.


1 Experimental electronic implementation of Rössler oscillators
The experiment consists of twelve electronic circuits, each of them being a piecewise Rössler-like system [1][2][3] operating in the chaotic regime, whose dynamics is given by:

ẋ_i = −α(Γ x_i + β y_i + λ z_i) + K Σ_j A_ij (x_j − x_i)

ẏ_i = −α(−x_i + υ y_i)

ż_i = −α(−g(x_i) + ψ z_i)

where the piecewise part is

g(x) = 0 for x ≤ 3,  g(x) = µ(x − 3) for x > 3.

Here x, y and z are the oscillator state variables, K is the coupling strength, g(x) is the piecewise linear function, and A_ij is the adjacency matrix containing the structure of the network. The factor α = 10^4 s^−1 sets the time scale, and the other parameters are: Γ = 0.05, β = 0.5, λ = 1, ψ = 1, µ = 15 and υ = b − 0.02. We choose b = 1.66 to have chaotic dynamics in the absence of coupling.
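The model can be integrated numerically in rescaled time τ = αt. The sketch below uses a simple Euler scheme for two mutually coupled units; the sign conventions and the placement of the coupling term follow the reconstruction of the equations given here, and all function and variable names are illustrative, not part of the original text.

```python
import numpy as np

# Parameters from the text (dimensionless form, time rescaled by alpha)
GAMMA, BETA, LAM, PSI, MU = 0.05, 0.5, 1.0, 1.0, 15.0
B = 1.66
UPS = B - 0.02  # upsilon

def g(x):
    # Piecewise linear function: 0 below the threshold, steep ramp above
    return np.where(x <= 3.0, 0.0, MU * (x - 3.0))

def euler_step(X, Y, Z, A, K, dt):
    # K * sum_j A_ij (x_j - x_i): diffusive coupling in the x variable
    coupling = K * (A @ X - A.sum(axis=1) * X)
    dX = -(GAMMA * X + BETA * Y + LAM * Z) + coupling
    dY = -(-X + UPS * Y)
    dZ = -(-g(X) + PSI * Z)
    return X + dt * dX, Y + dt * dY, Z + dt * dZ

# Two mutually coupled oscillators
A = np.array([[0.0, 1.0], [1.0, 0.0]])
X, Y, Z = np.full(2, 0.1), np.zeros(2), np.zeros(2)
traj = []
for _ in range(20000):
    X, Y, Z = euler_step(X, Y, Z, A, K=0.1, dt=1e-3)
    traj.append(X.copy())
traj = np.array(traj)
```

A fixed-step Euler scheme is sufficient for a sketch; for quantitative comparison with the circuit, a higher-order integrator and the dimensional time factor α = 10^4 s^−1 should be used.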
It is possible to translate the Rössler-like equations into the electronic circuit shown in Fig. 1 [4][5][6][7][8]; the circuit equations are the dimensional counterparts of the system above, with the coefficients Γ, β, λ, ψ, µ and υ, as well as the piecewise part, determined by the values of the electronic components. Table 1 summarizes the values of all the components used for the construction of the Rössler-like system. Finally, Fig. 2 shows the electronic circuit that introduces the diffusive coupling between nodes.
2 Analysis of the distribution of cross-correlation values for two uncoupled Kuramoto oscillators
In Fig. 3 of the main manuscript we reported that uncoupled Kuramoto oscillators can display arbitrarily large values of cross correlation among them, independently of the length of the time series. This is a counterintuitive feature, but it has a solid mathematical explanation. Namely, what we plot in Fig. 3 is the maximum value over an ensemble of 120 possible correlations among the time series of uncoupled Kuramoto oscillators. The equations for this system can be written in the form

θ̇_i = ω_i + ξ_i(t),

where ξ_i(t) is a Gaussian white noise. Since we process the series by removing a linear trend and normalizing with respect to the variance, each phase can be reformulated as an autoregressive process of order 1 (AR(1)) with unitary memory,

θ_t = θ_{t−1} + ε_t,

where ε_t is a Gaussian innovation with mean 0 and variance 1. The correlation between two independent AR(1) processes of this kind can be proven to be equal to the correlation between the first stochastic elements of the two time series, namely r_W = r(ε^i_1, ε^j_1) [9]. This correlation cannot depend on the length of the time series, since it only involves the first step; this explains why we detect an independence of the maximum value with respect to the integration time. The reason why the maximum can reach very high values can also be understood: the distribution of the AR(1) correlation can be transformed into a normal one with the so-called Fisher transformation, Z = tanh^−1(r_W). This transformation is monotonous, so the maximum values of Z correspond univocally to the maximum values of r_W. Now, as stated in the caption of the figure, we are considering the maximum over k = 120 independent realizations, that is, the maximum over 120 independent values of r_W, i.e. of Z. We call this variable x = max(Z). The cumulative distribution of such a maximum, F_k(x), can be proven to satisfy F_k(x) = [F_1(x)]^k, which, in our case, is the k-th power of the error function plus a constant.
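The length-independence of the maximum correlation can be checked numerically. The sketch below (the ensemble size k = 120 comes from the figure caption; the series lengths and seed are arbitrary choices) draws 120 pairs of independent unit-variance random walks for two different lengths and records the maximum absolute Pearson correlation in each ensemble; both lengths yield comparably large maxima.

```python
import numpy as np

rng = np.random.default_rng(7)
K_ENSEMBLE = 120  # ensemble size, as in the figure caption

def max_abs_corr(T, k, rng):
    # Maximum |Pearson correlation| over k pairs of independent random walks
    r = np.empty(k)
    for n in range(k):
        w1 = np.cumsum(rng.standard_normal(T))  # AR(1) with unitary memory
        w2 = np.cumsum(rng.standard_normal(T))
        r[n] = np.corrcoef(w1, w2)[0, 1]
    return np.abs(r).max()

maxima = {T: max_abs_corr(T, K_ENSEMBLE, rng) for T in (1_000, 10_000)}
```

Despite the processes being independent, the maximum over the ensemble remains large for both lengths, in line with the extreme-value argument above.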
In fact, the probability that the maximum over k independent values is less than a certain x is equivalent to the probability that all the k values are below x, by the definition of the maximum; this simply means

F_k(x) = P(Z_1 ≤ x, …, Z_k ≤ x) = [F_1(x)]^k.

It has to be noted that for k sufficiently large the distribution F_k converges to a limit function referred to as the Generalized Extreme Value (GEV) distribution, which is the analogue, for extreme values, of the normal distribution in the central limit theorem; i.e., the convergence of the PDF of x to this function is guaranteed independently of the form of the PDF of Z. Now that we have the cumulative distribution of the variable x = max(Z), it is possible to compute its expectation as

m = ∫ x dF_k(x) = ∫ x k F_1(x)^{k−1} F_1′(x) dx.

This integral can be numerically estimated as m ≈ 2.57 for k = 120, which leads, through the inverse Fisher transformation, to the expected maximum correlation r_m = tanh(m) ≈ 0.988. Estimating also the variance gives the one-sigma interval [0.973, 0.995] for the expected value r_m, corresponding to a ∼70% confidence interval.
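The expectation of the maximum can be evaluated with a short numerical quadrature. The sketch below assumes, as in the argument above, that Z is a standard normal variable, so that F_1 is the normal CDF; the trapezoidal grid and the integration limits are arbitrary choices.

```python
import math

k = 120  # number of independent realizations

def normal_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# m = E[max(Z_1..Z_k)] = integral of x * k * F1'(x) * F1(x)^(k-1) dx,
# evaluated with the trapezoidal rule on [-10, 10]
a, b, n = -10.0, 10.0, 100_000
h = (b - a) / n
m = 0.0
for i in range(n + 1):
    x = a + i * h
    w = 0.5 if i in (0, n) else 1.0
    m += w * x * k * normal_pdf(x) * normal_cdf(x) ** (k - 1)
m *= h

r_m = math.tanh(m)  # inverse Fisher transformation
```

With k = 120 this quadrature reproduces the values quoted in the text, m ≈ 2.57 and r_m ≈ 0.988.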
Regarding the Y-projection case, instead, we cannot say much, since the nonlinear transformation applied to the AR(1) processes allows us to compute neither the correlation explicitly nor the moments of the distribution via the generating function in a simple way.
For the frequencies-CC maximum values, instead, we observe a decrease of the maximum with increasing length of the time series, T, as 1/T. This can be explained by assuming that the maximum values scale with the length of the time series in the same way as the variance, which can be computed from the associated moment-generating function and indeed scales as 1/T. In fact, from equation 9 it is possible to see that the cross correlation of two frequency time series can be written as

CC_ij = (1/T) Σ_{t=1}^{T} ε^i_t ε^j_t.

We call z_t = ε^i_t ε^j_t the product of two independent normal variables. It can be proven that this variable follows a distribution with characteristic function φ_z(u) = (1 + u^2)^{−1/2}. Thus, the z_t being independent variables, the characteristic function of CC_ij can be written as

φ_CC(u) = [φ_z(u/T)]^T = (1 + u^2/T^2)^{−T/2}.

Deriving twice and evaluating at u = 0 we obtain the variance of CC, namely Var[CC] = 1/T.
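The 1/T scaling of the variance can be verified with a quick Monte Carlo estimate (a sketch; the series length and the number of realizations are arbitrary choices): the sample variance of CC_ij over many realizations of two independent Gaussian series should be close to 1/T.

```python
import numpy as np

rng = np.random.default_rng(0)
T, R = 200, 10_000  # series length and number of realizations

eps_i = rng.standard_normal((R, T))
eps_j = rng.standard_normal((R, T))

# CC_ij = (1/T) * sum_t eps^i_t * eps^j_t, one value per realization
cc = (eps_i * eps_j).mean(axis=1)
var_cc = cc.var()
# var_cc should be approximately 1/T
```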
Figure 1. Electronic implementation of a Rössler-like electronic circuit. The values of the parameters of the electronic components are summarized in Tab. 1. The term (x_j − x_i) accounts for the diffusive coupling between units, and the corresponding electronic circuit is shown in Fig. 2.

Figure 2. Electronic implementation of the diffusive coupling between a Rössler-like system and all of its n neighbours. Each branch of the circuit accounts for the difference between oscillators i and j, for a total of n branches. Finally, a voltage adder joins the outputs of the branches.