Phase synchronization and measure of criticality in a network of neural mass models

Synchronization plays an important role in neural network dynamics and accompanies cognitive activities such as memory, learning, and perception. These activities arise from collective neural behaviors that are not yet fully understood. This paper investigates a cortical model from this perspective. Historically, epilepsy has been regarded as a functional brain disorder associated with excessive synchronization of large neural populations; it is believed to arise from complex interactions between neural networks characterized by dynamic synchronization. Here, we investigated a network of neural populations in which the dynamics of each node correspond to the Jansen–Rit neural mass model. First, we studied a one-column Jansen–Rit neural mass model at four different input levels. Then, we considered a Watts–Strogatz network of Jansen–Rit oscillators and varied its parameters. We observed epileptic activity at the weak input level. The detailed results, including mean time series, phase spaces, and power spectra, revealed a wide range of behaviors, such as epileptic activity, healthy activity, and transitions between synchrony and asynchrony. At certain values of the coupling coefficient, there is an abrupt change in the order parameter. Since criticality is a candidate dynamic for the healthy brain, we examined several measures of criticality at these points. According to our study, some markers of criticality can occur at these points while others do not; which markers appear depends on the specific order parameter selected to observe them. The definition of a proper order parameter is therefore key. In our view, critical points must exhibit clear characteristics of scale invariance, not merely some subset of markers.
As a result, these phase-transition points are not critical, as they show no evidence of scale invariance.

Jansen-Rit model. In this paper, neural masses are defined as columns. The model consists of pyramidal neurons (the main population) that receive input from three sources: feedback from excitatory and inhibitory interneurons in the same column, and an external input from other columns. Figure 1 is a schematic representation of this model.
We consider a network of interacting populations, representing a patch of the cortex. Each node represents a neural mass with Jansen-Rit dynamics. Nodes can interact with each other and generate different behaviors. First, we present these dynamics for each node k, k = 1, ..., N. The model is written as three second-order differential equations, which can be rewritten as six first-order differential equations 39, where (y0, y3), (y1, y4), and (y2, y5) are the activities of the pyramidal, excitatory, and inhibitory ensembles, respectively. S is a sigmoid function that transforms the average membrane potential of the neurons into the mean firing rate of action potentials. p (s−1) denotes an external input, which can be considered noise or input from other columns; the input used by Jansen and Rit was uniformly distributed noise ranging from 120 to 320 pulses per second 38.
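For reference, the six first-order equations for a single column take the standard Jansen-Rit form (sketched here with the usual parameter names; any coupling from other columns enters through the input p):

```latex
\begin{aligned}
\dot{y}_0 &= y_3, &\qquad \dot{y}_3 &= A a\, S(y_1 - y_2) - 2a\, y_3 - a^2 y_0,\\
\dot{y}_1 &= y_4, &\qquad \dot{y}_4 &= A a \bigl[\, p(t) + C_2\, S(C_1 y_0) \,\bigr] - 2a\, y_4 - a^2 y_1,\\
\dot{y}_2 &= y_5, &\qquad \dot{y}_5 &= B b\, C_4\, S(C_3 y_0) - 2b\, y_5 - b^2 y_2,
\end{aligned}
\qquad
S(v) = \frac{2 e_0}{1 + e^{\,r (v_0 - v)}}.
```

The commonly used parameter values are A = 3.25 mV, B = 22 mV, a = 100 s−1, b = 50 s−1, C1 = C, C2 = 0.8C, C3 = C4 = 0.25C with C = 135, and e0 = 2.5 s−1, v0 = 6 mV, r = 0.56 mV−1.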

Simulation. A Watts-Strogatz network (with a rewiring probability of 0.2) of 80 oscillator nodes is considered in this simulation. Each node is connected to 20 neighbors, ten on each side, and is modeled by Jansen-Rit neural mass dynamics. We use the stochastic Runge-Kutta method to simulate the system 78, with a time step of 10−4 s. The coupling coefficient between the neural populations within each mass is set to 135 (so that every single node shows an alpha rhythm). Each node receives external input from other brain regions. First, we suppose that the coupling coefficient between nodes is zero, and then we increase it. The external input used by Jansen and Rit is uniformly distributed noise ranging from 120 to 320 pulses per second 38. We divide this range into four segments and analyze the network dynamics for four input levels: weak, intermediate, strong, and ultra-strong. The input is Gaussian random noise with a mean of 145, 195, 245, or 295 at the four intensity levels, respectively, and a standard deviation of 3.25 for all of them. Each simulation lasts 200 s and is repeated 20 times for each coupling coefficient. The first 5 s of each simulation are discarded to ensure that the system has reached its steady state.
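A minimal single-column sketch of this setup can clarify the integration loop. The paper uses a stochastic Runge-Kutta scheme; the version below uses a simpler Euler step (an assumption for illustration, not the authors' integrator), with the standard Jansen-Rit parameters and a Gaussian input drawn each step:

```python
import numpy as np

# Standard Jansen-Rit parameters (synaptic gains in mV, rate constants in s^-1).
A, B = 3.25, 22.0
a, b = 100.0, 50.0
C = 135.0                                  # intra-column coupling, as in the paper
C1, C2, C3, C4 = C, 0.8 * C, 0.25 * C, 0.25 * C
e0, v0, r = 2.5, 6.0, 0.56

def sigmoid(v):
    """Average membrane potential -> mean firing rate."""
    return 2 * e0 / (1 + np.exp(r * (v0 - v)))

def simulate(T=2.0, dt=1e-4, p_mean=195.0, p_sigma=3.25, seed=0):
    """Euler integration of one Jansen-Rit column driven by Gaussian input."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    y = np.zeros(6)                        # state (y0, y1, y2, y3, y4, y5)
    out = np.empty(n)
    for i in range(n):
        p = p_mean + p_sigma * rng.standard_normal()
        y0, y1, y2, y3, y4, y5 = y
        dy = np.array([
            y3,
            y4,
            y5,
            A * a * sigmoid(y1 - y2) - 2 * a * y3 - a**2 * y0,
            A * a * (p + C2 * sigmoid(C1 * y0)) - 2 * a * y4 - a**2 * y1,
            B * b * C4 * sigmoid(C3 * y0) - 2 * b * y5 - b**2 * y2,
        ])
        y = y + dt * dy
        out[i] = y[1] - y[2]               # model output: y1 - y2
    return out

signal = simulate()
```

A network version would add a coupling term to each node's input p based on the outputs of its Watts-Strogatz neighbors.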

Results and discussion
Synchronization phase transition. We start with a single node described by the Jansen-Rit model.
We investigate how the system responds to continuous changes in the external input, treated here as a deterministic parameter. Note that elsewhere in this paper the external input is varied over four ranges and regarded as stochastic. The mean model output and its power spectrum for the four input levels are shown in Fig. 2. Increasing the input range increases the level of activity. At each input intensity in this figure (A-D), the model produces oscillatory behavior. Increasing the input raises the dominant frequency, but it remains in the alpha band (E). Then, we consider a network of interacting populations, representing a patch of the cortex, where each node is a neural mass with Jansen-Rit dynamics. A number of factors affect the network dynamics; one of the most important is the coupling strength. Nodes can interact with each other and generate different dynamical behaviors. Changing the coupling coefficient between nodes and analyzing the resulting network dynamics has received interest recently 51,54,55.
In the beginning, we assume that each node receives external input from the weak range. Cross-correlation is a popular method for investigating and comparing time series. By taking the time series of each node in the model and calculating the Pearson cross-correlation (Pcc) between them, a square matrix is created; its ij-th element is the Pcc between the dynamics of nodes i and j. The mean of the Pcc matrix for weak input is shown in Fig. 3. Vertical red bars represent the dispersion (standard deviation) over 20 runs.
Zero cross-correlation results from zero coupling. As shown in Fig. 3, increasing the coupling up to 1.5 increases the correlation. In addition, the mean signal amplitude grows (first column in Fig. 4). The phase spaces are shown in the second column of Fig. 4; the x-axis and y-axis show the output signal ( y1(t) − y2(t) ) and its derivative ( y4(t) − y5(t) ), respectively. In this range of coupling strength, the dominant frequency is still in the alpha band (third column in Fig. 4).
At a coupling coefficient of 1.5, the network's behavior changes abruptly and, as can be seen in Fig. 5 (first column), large-amplitude oscillations appear. In each run, the switch from high- to low-amplitude oscillations occurs at a different time. The corresponding phase space is shown in Fig. 5 (second column); at this coupling coefficient, it is composed of two limit cycles. The first cycle corresponds to the high-amplitude oscillations and shows rhythmic spiking activity, while the second corresponds to the lower-amplitude signal. The high-amplitude signals are rhythmic spiking activities in the theta frequency band, whereas the dominant frequency of the low-amplitude regime is alpha. Indeed, this behavior is a mixed theta-alpha activity. The power spectrum at α = 1.5 shows a significant change in appearance (third column in Fig. 5), which confirms a disorder in network function [79][80][81].
The results indicated that in the [1.5, 3] regime of coupling strength, the output signal in all repetitions shows only two types of behavior: high-amplitude oscillations that either switch to low-amplitude oscillations or do not (first column in Fig. 5). We call this region a high-synchrony regime; the coupling coefficient of 1.5 (3) is the starting (ending) point of the unusual behavior.
Based on Fig. 3, a minor increase in the coupling coefficient beyond 1.5 causes a dramatic rise in the mean of the Pcc, and, consequently, the network reaches maximum synchronization. Moreover, there is a large dispersion of the mean of the Pcc matrix at the bifurcation points (1.5 and 3); at these points, the network dynamics changes dramatically, and coupling coefficients of 1.5 and 3 yield a large variety in the mean of the Pcc, accompanied by a large variance. Cross-correlation matrices at the value of 3 are shown in Fig. 6. These matrices have different patterns: red (blue) indicates full synchronization (anti-synchronization). Local and global synchronization can also be seen in this figure, indicated by the red and blue blocks. During local synchronization, some ensembles of neurons behave identically while others act completely differently, i.e., synchronized clusters form. To put it simply, a fully red Pearson cross-correlation matrix means that the network is in a global synchrony state, while the simultaneous presence of blue and red blocks in a matrix indicates in-phase and anti-phase local synchronization. These different patterns are reminiscent of the different synchronization patterns of brain networks during different tasks 54. Again, we tracked the increase in coupling strength: a decrease in the mean of the Pcc is observed up to 3.5, and Fig. 7 shows that the mean signal output returns to the oscillatory state.
As α increases from 3.5 to 5, the Pcc grows to its maximum value. The synchronized behavior switches to a fixed-point state if α is increased further, i.e., each node leaves the limit cycle and collapses into a resting state (fixed point). Mean signals and their phase spaces for one repetition are shown in Figs. 7 and 8, respectively.
Moreover, the power spectra for α = 3.5, 4, 4.5, and 5 in Fig. 9A show a switch from the disordered to the normal state, and the dominant frequency of the network moves into an alpha rhythm. Indeed, at these coefficients, the mean signal output returns to the oscillatory state, and the theta-band frequency changes to the alpha band. The alpha rhythm does not necessarily indicate a healthy state (the average potential value is increased); we conclude only that the system leaves the disordered state, and it may still enter a pathological condition. A close look at Fig. 9B shows the dominant frequency for different coupling coefficients. The color of each cell represents the dominant frequency of an independent in-silico experiment (run), and the vertical axis gives the coupling strength of each run in its row. There is a jump phenomenon in frequency in the [1.5-3] regime, which is a crucial feature for diagnosing brain disorders 79,82,83. The coupling coefficient and the dominant frequency clearly have an inverse relationship. Also, the presence of two frequency states at α = 1.5 and 3 is an exciting result.
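The dominant frequency reported per run can be extracted as the peak of a Welch power spectral density estimate; a minimal sketch (the helper name and parameters are illustrative, not from the paper):

```python
import numpy as np
from scipy.signal import welch

def dominant_frequency(x, fs):
    """Frequency (Hz) at the peak of the Welch power spectral density."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), 4 * int(fs)))
    return f[np.argmax(pxx)]

# Example: a noisy 10 Hz (alpha-band) oscillation sampled at 1 kHz.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
```

Applied to the network's mean output at each coupling coefficient, this yields one cell of a heatmap like Fig. 9B.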
So far, the external input to each node has been taken from the weak level. Next, we look at the input values in the other three ranges (intermediate, strong, and ultra-strong) and go over the results.
[Figure 5 caption: one of these two states is guaranteed to appear in every repetition for 1.5 < α < 3 (regardless of alpha). Two independent runs at α = 1.5 are presented; high-amplitude oscillations appear that either switch to low-amplitude oscillations or do not, and the phase spaces are composed of two limit cycles or just one (second column). The inset of each time series in the first column shows a 3 s interval of the signal. The power spectrum at α = 1.5 shows a significant change in appearance (third column); theta is the dominant frequency.]
[Figure 8 caption: the inset of each signal shows a 3 s segment. At α = 5.5, each node leaves the limit cycle and settles into a resting state (fixed point).]
A low correlation between units is observed for all input intensities at zero coupling; in this case, the oscillators cannot significantly affect each other. Lower-intensity inputs clearly require a higher coupling coefficient to reach their maximum Pcc. Every unit receives input from its neighbors plus a stochastic amount drawn from an interval; strengthening the coupling between nodes increases the input received from neighbors, so less stochastic input is needed. Indeed, the input is a fundamental element, and its variation affects both the amount of correlation and the rate at which it rises to its maximum. A remarkable event is seen in Fig. 11: a frequency jump at the weak input level, related to the high-frequency regime, which has been confirmed in 79.
Additionally, at each input level (except the weak level), increasing the coupling coefficient decreases the dominant frequency, which nevertheless remains within the alpha band.

Measure of criticality.
It is claimed that the healthy brain operates in a critical regime [64][65][66][67]. How critical dynamics can be detected in neural models is an interesting and challenging question. There are several measures of criticality that can be tested.
For any marker of criticality (as observed in the brain) to exist, a critical point or a critical region (Griffiths phase) first needs to be determined. We investigated several points between 1.25 and 3.25. Figure 12A plots the order parameter against the control parameter (the coupling coefficient). Two peaks in the coefficient of variation during the continuous phase transition (at α = 1.5, 3) are the first marker of criticality (Fig. 12B) 57; the maxima of this curve correspond to the transition points in Fig. 12A. The criticality hypothesis states that the brain may work at the edge of a phase transition 57. Our research considers the synchronization phase transition, so we studied whether the candidate points α = 1.5 and 3 can be validated as critical. On the edge of synchronization, a wide range of patterns can be generated, as shown in 84. A heatmap of the spatio-temporal matrix during the last 2 s at α = 1.25, 1.5, 1.75, 2.75, 3, and 3.25 is shown in Fig. 13. At α = 1.25, the fluctuations are random and the level of synchronization is low. Conversely, at α = 1.75, the activity of the network is extremely ordered; this pattern corresponds to full synchronization, which does not indicate a healthy state. Between these, at α = 1.5, the network activity lies between high and low synchrony. Similarly, at α = 2.75 and 3.25, the activities are highly ordered and disordered, respectively, and at α = 1.5 and 3, the network shows a pattern between the ordered and disordered states. So, it is possible to say that these points (1.5 and 3) carry one marker of criticality.
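The quantities behind this analysis, an order parameter and its coefficient of variation across runs, can be sketched briefly. The paper does not specify its order parameter exactly; here we assume the common choice of the Kuramoto order parameter computed from Hilbert-transform phases:

```python
import numpy as np
from scipy.signal import hilbert

def kuramoto_order(signals):
    """Time-averaged Kuramoto order parameter r of an (N, T) array of node
    time series, using Hilbert-transform instantaneous phases."""
    phases = np.angle(hilbert(signals, axis=1))
    r_t = np.abs(np.mean(np.exp(1j * phases), axis=0))
    return r_t.mean()

def coefficient_of_variation(values):
    """CV of an order parameter over repeated runs; peaks of this curve
    versus the coupling coefficient are one marker of criticality."""
    values = np.asarray(values, dtype=float)
    return values.std() / values.mean()

# Identical oscillators -> r near 1; randomly phase-shifted ones -> smaller r.
t = np.linspace(0, 10, 2000)
coherent = np.tile(np.sin(2 * np.pi * 10 * t), (20, 1))
rng = np.random.default_rng(0)
incoherent = np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi, (20, 1)))
```

With a different order parameter, the location and sharpness of the CV peaks can change, which is exactly the sensitivity the discussion below emphasizes.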
At critical points, perturbations grow in magnitude and different behaviors occur. Since the system's stability is weak, events are expected at every scale (micro or macro), which is one of the most remarkable properties of criticality; broadly speaking, the border between scales is not clear.
[Figure 11. Dominant frequency as a function of the coupling coefficient. Increasing the coupling coefficient decreases the dominant frequency. One remarkable event is observed: a frequency jump at the weak input level, related to the high-frequency regime.]
Dysfunction of the cortex is associated with either abnormally low or high synchrony, while moderate synchrony is characteristic of a healthy cortex. In some cases, extreme variability in synchrony is unavoidable if the cortex must operate with moderate mean synchrony. At critical points, synchrony has a medium mean and high variability 85,86. Accordingly, both α = 1.5 and 3 carry a criticality marker, as they satisfy this criterion (based on Fig. 3) 57,68,87,88. The presence of LRTCs in the amplitude of neural oscillations would further support the criticality hypothesis; indeed, long-range temporal correlations are a vital feature of criticality 89,90.
The temporal correlation structure of a signal can be investigated with the mean auto-correlation or the DFA method. A signal shows LRTCs if its auto-correlation decays as a power law (with an exponent between −1 and 0). Generally, auto-correlation functions are very noisy in their tails, so estimating the exponent is complicated. DFA is a proper technique that overcomes these problems 90 and has recently been applied to neural mass models in 51.
According to the previous section, one phase transition appears to occur at the weak input level around α = 1.5, 3. We checked for the presence of LRTCs at these points in an arbitrary repetition. In Fig. 14, DFA is applied to the absolute value of the signal's Hilbert transform, with no overlapping. The values on the x-axis are in seconds, on logarithmic scales, based on the segment sizes ( L_seg = 2N_s ; the definition of N_s can be found in the "Methods" section). The y-axis shows the mean standard deviation over all segments of a given size (F(s), as defined in the "Methods" section). The linear fit is not suitable for the data (according to 91, a spline fit is the best-fitting model). Therefore, LRTCs do not exist at these points. Similarly, LRTCs were not exhibited at other coupling coefficients; consequently, the network with the defined parameters does not show this feature.
Link to seizure. Seizure is characterized by complex dynamics and results from a high level of neuronal synchrony. The transition between high- and low-amplitude signals is one of the most important indicators of brain disorders such as epilepsy 79,92,93. A generalized spike-wave discharge (GSWD) occurs at α = 1.5 with a frequency of approximately 4 Hz. GSWDs commonly arise after paroxysmal oscillations in corticothalamic networks, but their mechanism is not yet clear 94,95. Signals with high amplitude correspond to rhythmic spikes in the theta frequency band; these can indicate seizures, which is confirmed by the seizure frequencies 92. Indeed, the onset of this disorder can be placed at α = 1.5. For coupling strengths between α = 1.5 and α = 3, there are two possible behaviors (Fig. 5). The low-amplitude signals present the alpha rhythm, and in this mixed seizure-alpha behavior, ictal activity may occur 61. Importantly, the length of each simulation is 200 s: the seizure activity either lasts the full 200 s or switches to a low-amplitude oscillatory state. Consequently, we call this region a seizure regime, with the coupling coefficient of 1.5 (3) as the starting (ending) point of seizure. It is insightful to note that different frequency bands play a remarkable role in seizure 96,97. At α = 1.5, we found a transition from the preictal to the ictal state, recognized by rhythmic spikes in the theta frequency band, which triggers the seizure activity. It is noteworthy that the frequency jump is vital to understanding seizure.

Conclusion
Neural mass models are widely used to simulate the activity of populations of neurons. In this work, we first described the one-column Jansen-Rit neural mass model. Next, we constructed a small-world network of identical oscillators whose dynamics correspond to this model. Our results showed that, at a weak input level, the dominant frequency of a single unit differs from that of a network of units.
Spiegler et al. 58 demonstrated that the dominant rhythm can be scaled by the ratio between the inhibitory and excitatory time constants. Our results suggest that even though the alpha rhythm is prominent in each node, a network composed of such nodes does not necessarily show this rhythm. This is an emergent property of a complex system: every isolated node exhibits an alpha rhythm, which changes when the units interact with each other 98.
Synchronization phenomena have a significant impact on how the brain functions, both normally and abnormally, and phase synchronization is one of the most fundamental phenomena in nonlinear dynamics. The input level and the coupling coefficient are crucial parameters for system synchronization, as we explained in detail. In our research, an unusual event was detected in the weak input range: a high-synchrony activity related to the first peak of the Pcc matrix. α = 1.5 and 3 were identified as candidate bifurcation points, and [1.5-3] was defined as a seizure region. We observed GSWDs with an approximately 4 Hz frequency at α = 1.5. It is vital to note that different frequency bands can have significant effects on epilepsy 96. A transition from a preictal to an ictal state was observed at α = 1.5; a rhythmic spike in the theta frequency band marks the beginning of this transition, which triggers seizure activity. The frequency jump is a crucial factor in understanding epilepsy 97.
Thus far, all of these results were obtained under the weak input. Although the variance of the mean of the Pcc was high at α = 1.5 and 3 for the intermediate, strong, and ultra-strong input levels, at those levels the coupling strength and input were unable to produce the full range of behaviors seen at the weak input level, so no bifurcation or phase transition exists in those cases.
In the final section, we presented some markers of criticality and investigated them at the bifurcation points. Criticality is a commonly asserted hypothesis for the healthy brain. From a theoretical perspective, a system either is poised at the critical point or is not. However, previous resting-state fMRI studies have shown that the brain spends most of its time wandering around a broad region near a critical point rather than sitting at it 99. Indeed, there is an extended region around the critical point in which cortical networks operate. Due to the complex hierarchical-modular structure of cortical networks, a critical point in the brain can extend to a critical-like region that corresponds to a Griffiths phase in statistical mechanics 100.
According to our study, some markers of criticality can occur at the phase-transition points, while others may not; which markers appear depends on the specific order parameter selected to observe them. The definition of a proper order parameter is therefore crucial. In our view, critical points must exhibit clear characteristics of scale invariance, not merely some subset of markers. As a result, α = 1.5 and 3 are not critical, as they show no evidence of scale invariance. The definition of new measures or observables in which scale invariance could be appreciated remains a challenging question. A separate matter is that empirical measurements in the actual brain can exhibit some markers of criticality. It also remains an open question where the brain operates (whether at a critical point/region or not) and what type of phase transition it exhibits.
In summary, the randomness of the input, the initial conditions, noise, and the coupling strengths in a network can generate different complex behaviors. We should also mention that these behaviors are not merely triggers of seizure activity, because seizure dynamics cannot be understood easily. There are research reports on delay and noise in coupled oscillators 51,[101][102][103][104][105]. Delays between neural populations can produce interesting and complex dynamics in networks; however, in this work we assumed they were zero. Note that this model was analyzed in the alpha frequency band, i.e., the coupling between the excitatory and inhibitory masses in each node was set to 135. Further studies can focus on the epileptic state, and their results may help in the detection and treatment of this disease. It is possible that adjusting the standard deviation of the stochastic input, which was not considered here, would have a substantial effect and important implications.
We are left with the question of how the Jansen-Rit model can be modified to show more markers of critical behavior (especially scale invariance) and, if so, which parameter adjustments (noise, delay, coupling between inhibitory ensembles, etc.) are necessary.

Methods
Graph theory. Modeling the brain as a complex network is a powerful mathematical tool for understanding the structural and functional architecture of the brain. Structural brain networks can be represented as graphs whose nodes (vertices) correspond to neural elements (neurons or brain regions) linked by physical connections (edges). A simple representation is the adjacency matrix M, where M_ij = 1 if there is a connection between node i and node j, and M_ij = 0 otherwise. In this work, neural populations are considered as nodes, and edges are interpopulation connections.
The clustering coefficient measures how densely a node's neighbors are connected to one another, forming a cluster. For each unit i it is defined as C_i = 2E_i / (k_i (k_i − 1)), where E_i is the number of edges between the neighbors of i and k_i is the degree of node i. For a whole network, this feature is defined as the average clustering coefficient over all units.
The shortest path length is the least number of edges between two nodes. The average path length is L = (1 / (N(N − 1))) Σ_{i≠j∈V} d(i, j), where V is the set of nodes, N is the number of nodes in the network, and d(i, j) is the shortest path from node i to node j. This average represents the typical node-to-node distance in a graph and indicates how rapidly information can be transferred through the network. Regular, random, and small-world networks are three important network models that have been studied extensively. In regular networks, each node has exactly the same number of connections, and the clustering coefficient is high. In random topologies, the connections follow a Poisson-like degree distribution with a low level of clustering. The average path length is long (short) in regular (random) systems.
Evidence suggests that most real-world networks have small-world properties, combining two structural features: short average path length (via long-range connections) and high clustering coefficient 71,106-108. The Watts-Strogatz (WS) model was proposed as the simplest network with small-world properties. In a WS network, N nodes are placed on a regular ring lattice, each connected to its k nearest neighbors, k/2 on either side. With probability p, each edge is rewired to a new node. For p = 0 and p = 1, regular and random networks are produced, respectively.
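The network used in the simulations, and its small-world metrics, can be constructed in a few lines (a sketch using networkx; the seed is arbitrary):

```python
import networkx as nx

# Watts-Strogatz graph matching the simulation setup: 80 nodes, each
# linked to 20 neighbors (k = 20, ten per side), rewiring probability 0.2.
G = nx.watts_strogatz_graph(n=80, k=20, p=0.2, seed=1)

# Small-world signature: high clustering together with a short average
# path length.
clustering = nx.average_clustering(G)
path_length = nx.average_shortest_path_length(G)

# Adjacency matrix M with M[i, j] = 1 iff nodes i and j are connected,
# usable as the inter-node coupling mask of the simulation.
M = nx.to_numpy_array(G)
```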
Pearson cross-correlation. Neural interactions can be measured by multiple criteria, each with its advantages and disadvantages 109. The simplest measure of non-directed interaction is the Pcc, which quantifies the linear relationship between each pair of random variables. This coefficient discards the temporal structure, so the time series are treated as realizations of random variables. The normalized one-dimensional Pcc between two time series x and y is defined as Pcc(x, y) = Σ_{t=1}^{N} (x_t − <x>)(y_t − <y>) / sqrt( Σ_{t=1}^{N} (x_t − <x>)² · Σ_{t=1}^{N} (y_t − <y>)² ), where N is the length of the signal and <x> is the mean of the time series x. The value of this coefficient ranges between −1 and 1: 1 (−1) represents a perfect positive (negative) linear correlation, and 0 means the two series are uncorrelated. Note that the matrix constructed by this definition is symmetric, with ones on the main diagonal.
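The Pcc matrix used throughout the Results is exactly what numpy's `corrcoef` computes for a stack of node time series; a minimal sketch:

```python
import numpy as np

def pcc_matrix(signals):
    """Pearson cross-correlation matrix of an (N, T) array of time series.
    Entry (i, j) is the Pcc between the series of nodes i and j."""
    return np.corrcoef(signals)

# Two in-phase series (perfect positive correlation) and one in
# anti-phase (perfect negative correlation).
t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 5 * t)
signals = np.vstack([x, 0.5 * x, -x])
M = pcc_matrix(signals)
```

Averaging the off-diagonal entries of M over the 20 runs gives points on a curve like Fig. 3.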
Detrended fluctuation analysis (DFA). The DFA method, first introduced by Peng et al. 110, quantifies the self-affinity of a signal and detects long-range correlations in time series. The procedure consists of the following five steps.
Scientific Reports | (2022) 12:1319 | https://doi.org/10.1038/s41598-022-05285-w

• Step 1: Consider a time series x of length N and calculate the cumulative sum (profile) Y_t = Σ_{i=1}^{t} (x_i − <x>), t = 1, ..., N, where <x> is the mean of x.
• Step 2: Divide Y_t into N_s = ⌊N/s⌋ segments of equal size s. Since N/s may not be an integer, the division is repeated from the end of the signal, so that 2N_s segments are obtained in total. In each segment, a least-squares linear fit is calculated by minimizing the squared errors.
• Step 3: In each segment, calculate the root mean square (RMS) of the detrended profile, F_seg(s) = sqrt( (1/s) Σ_{t=1}^{s} (Y_t − Y_local(t))² ), where Y_local denotes the local trend from step 2.
• Step 4: Repeat step 3 for different segment sizes s ranging between 4 and N/10 to find the relation between the scaling exponent and the data fluctuation, averaging the RMS values over all 2N_s segments to obtain F(s).
• Step 5: Plot F(s) against s on a logarithmic scale. For a power-law relation, F(s) ∼ s^a, where a, the fluctuation or scaling exponent, is determined as the slope of a straight-line fit. 0.5 < a < 1 represents long-range correlation.
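The five steps above can be sketched compactly (an illustrative implementation, not the authors' code; the scale list is an arbitrary choice):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: cumulative-sum profile, segmentation
    from both ends (2*Ns segments), linear detrending per segment, mean RMS
    fluctuation F(s) per scale, and a log-log fit whose slope is the
    scaling exponent."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                 # step 1: profile
    F = []
    for s in scales:
        ns = len(y) // s
        segs = [y[i * s:(i + 1) * s] for i in range(ns)]
        segs += [y[len(y) - (i + 1) * s: len(y) - i * s] for i in range(ns)]
        t = np.arange(s)
        rms = []
        for seg in segs:                        # steps 2-3: detrend, RMS
            coef = np.polyfit(t, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        F.append(np.mean(rms))                  # step 4: F(s)
    # step 5: slope of log F(s) versus log s
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return alpha

# White noise has no long-range correlation: expected exponent near 0.5.
rng = np.random.default_rng(0)
alpha_white = dfa(rng.standard_normal(10000),
                  scales=[16, 32, 64, 128, 256, 512])
```

As a sanity check, applying the same function to the cumulative sum of white noise (a Brownian path) should give an exponent near 1.5.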