Abstract
This research illustrates that complex dynamics of gene products enable the creation of any prescribed cellular differentiation pattern. These complex dynamics can take the form of chaotic, stochastic, or noisy chaotic dynamics. Based on this outcome and previous research, it is established that a generic open chemical reactor can generate an exceptionally large number of different cellular patterns. The mechanism of pattern generation is robust under perturbations and is based on a combination of Turing machines, Turing instability and L. Wolpert’s gradients. These results can help explain the formidable adaptive capacities of biochemical systems.
Introduction
More than 50 years ago, Lewis Wolpert^{1} proposed the positional information (PI) model to describe patterns of different cell types. Seventy years ago, Alan Turing introduced the idea of patterns originating from homogeneous states via a reaction–diffusion mechanism (RDM)^{2}. In both conceptual models, an organism is represented as a pattern consisting of different cells. Both Turing’s and Wolpert’s approaches assume that morphogens, special reagents, can change cell states. Modern research has shown that the French flag and reaction–diffusion models do not capture all details of real biological patterning systems^{3}, and the relationship between them remains unclear, although some ideas have been proposed by Green and Sharpe^{4}.
In this paper, inspired by the ideas of Brenner^{5} and Jacob about evolutionary tinkering^{6}, we show that a combination of the RDM and the PI mechanism yields a universal, robust patterning mechanism. Brenner emphasized the importance of Turing’s ideas in the context of his famous machine. A Turing machine is a mathematical model of computation that defines an abstract machine manipulating symbols on a tape according to a set of rules. The machine stops when it arrives at a prescribed terminal state. In turn, Jacob introduced the concept of tinkering (evolution uses all available means^{6}).
We show that the dynamics of a generic open chemical reactor is capable of generating an exponentially large number of different attractors, which may be chaotic, periodic, or steady states. The chaotic or periodic attractors can occur if our reaction–diffusion model is neither gradient-like nor monotone. The results hold under certain mathematical conditions, which admit a transparent chemical interpretation for two-component systems: one of the reagents is neither an activator nor an inhibitor for the other reagent, the reagent diffusion coefficients are sharply different, and the space dimension of the system is \(\ge 2\). The second result is that such open chemical systems can be used as a sort of Turing machine (TM), which can produce an exponentially large number of different cellular differentiation programs (CDPs, see Fig. 1). A CDP is a program that, via morphogen concentrations, produces cell patterns unfolding in time and space, for example, gastrulation patterns or the periodic structures of somitogenesis^{7,8}. The number of possible CDPs increases exponentially with the system size.
The proposed cell pattern formation mechanism is robust against fluctuations and perturbations. Thus, a combination of two of Turing’s basic ideas (machines and instability in chemical systems) leads to general results, which sharply extend the Turing instability^{2} model. The combined Turing–Wolpert model makes it possible to obtain patterns evolving over time that are much more complex than zebra stripes.
The mechanism of attractor and pattern generation can be sketched as follows. One can show (see “Methods”, and the work^{9}) that a generic open chemical system is capable of generating a number of chaotic or periodic attractors if it involves appropriate spatial heterogeneities. These inhomogeneities work as activators and excite the formation of sharply localized segments, similar to Drosophila segmentation. Segments correspond to local maxima of the slow reagent concentration. These segments interact in a long-range manner via the fast diffusing component. These mechanisms are reminiscent of those described in the work^{10}, but here we can prove that they lead to chaos. The chaotic dynamics allows us to generate complicated cell patterns in space and time, as shown by the results of the works^{11,12,13,14} (see Fig. 2).
According to Meinhardt^{10}, to generate complex patterns, a reaction–diffusion mechanism should include local self-activation and long-range inhibition. The new mechanism proposed in this paper, which is capable of creating a number of complicated local attractors, is more sophisticated. It also uses local activation leading to layered patterns for the slowly diffusing component, but instead of long-range inhibition we have a more complicated interaction structure. In some spatial domains, the slow reagent serves as an activator, while in others it works as an inhibitor. Such a mosaic (mixed) structure of activation and inhibition arises due to spatial gradients. A well-adjusted spatial gradient causes the formation of many well-localized segments, where the slowly diffusing reagent u is concentrated. The reagent u in these local layers interacts with the fast diffusing component v, which, in turn, acts on those layers, and so we obtain nonlinear nonlocal feedback and complex dynamics. Due to the mosaic structure of activation and inhibition, this mechanism can lead to chaos. The dynamics induced by each group of interacting segments generate different local attractors.
Biological systems are affected by perturbations (including stochastic ones) and the concentrations of gene products fluctuate. Moreover, pattern formation proceeds under environmental noise and mutations. Nonetheless, we know that the final result of pattern formation is remarkably stable, i.e., the patterning of real organisms is quite a robust (canalized) process. Starting with the seminal works^{15,16}, a number of studies have focused on the canalization and robustness problem; see, for example, the research^{17,18,19,20} and the review^{21}.
Using Turing machine (TM) theory, we show that robustness can be provided if the patterning proceeds in several stages (a multistage process) and, for each stage, there is a stop signal indicating that an intermediate target pattern has been achieved. Therefore, Turing machine theory helps to show the formidable ability of biochemical systems to generate different structures and the robustness of this generation.
Note that the pattern and chaos emergence process can be considered as self-organization, and therefore the Turing machines arise as a result of self-organization. The question of the computational power of self-organization in physical systems remains highly controversial: some authors believe^{22} that it can perform hypercomputation (something stronger than Turing machines). This complex question is outside the scope of this research.
Moreover, we conducted a theoretical analysis and performed simulations using white noise, prioritizing simplicity and utilizing the classical mathematical theory primarily developed for white noise and Wiener processes (see the study^{23} for an excellent exposition). In this work, we do not attempt to describe bifurcations between different attractors induced by noise, because our method uses structural stability under deterministic and random perturbations^{24,25}.
Results
Model
Let us consider the following reaction–diffusion model with spatial gradients
where \(u=(u_1(x,y,t), u_2(x,y,t), \ldots , u_m(x,y,t))\) are unknown functions of reagent concentrations defined on \(\Omega \times \{ t \ge 0 \}\), \(\eta (x,y)=(\eta _1, \ldots , \eta _m)\) is a vector-valued function, which can be considered as a spatially heterogeneous source (possibly involving a small space-temporal noisy term), \(\Omega\) is the strip \((-\infty , \infty ) \times (0,1) \subset \mathbf{R^2}\), and \(\textbf{D}=\mathrm{diag}(d_1, \ldots , d_m)\) is a diagonal matrix of diffusion coefficients \(d_i >0\). We assume that the reaction terms \(f_i(u)\) are smooth, for example, polynomials. We complement (Eq. 1) with standard initial and boundary conditions; for example, at \(y=0,1\) we set zero Neumann boundary conditions for certain \(u_i\) and zero Dirichlet ones for the others. To simplify the problem, we set periodic boundary conditions along x, assuming that all the functions \(u_i^0\) that define the initial data are \(2\pi\)-periodic in x. The initial boundary value problem (IBVP), defined by Eq. (1) together with the standard boundary and initial conditions, determines a family of local semiflows \(S^t_\textbf{P}\). Each semiflow \(S^t_\textbf{P}\) depends on the problem parameters \(\textbf{P}\), which are the external gradients \(\eta (x,y)\) and the diffusion coefficients \(d_i\)^{9}. As a simple example, one can consider the following two-component system:
where \(\eta (x,y), \zeta (x,y)\) are smooth functions, f, g are smooth nonlinearities, for example, polynomials, and (x, y) lies in the strip \((-\infty , \infty ) \times [0, L_2]\). For mathematical tractability, we consider periodic (or zero Neumann) boundary conditions \(u(x+L_1,y, t)=u(x, y, t), v(x+L_1,y, t)=v(x, y, t)\), the zero Neumann condition for u at the top and the bottom \(y=0, L_2\), and the zero Dirichlet conditions for v at \(y=0, L_2\). We also set initial conditions for u, v. We consider the degradation coefficients \(\lambda _i \ge 0\), the diffusion coefficients \(d_u, D_v\) and the sources \(\eta , \zeta\) (which can be interpreted as maternal morphogen gradients) as the parameters \(\textbf{P}\).
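As a rough numerical illustration of a system of the form of Eqs. (2, 3), the sketch below integrates a one-dimensional analog with periodic boundary conditions by an explicit finite-difference scheme. The nonlinearities f, g, the gradient source eta, and all parameter values are illustrative assumptions, not the specific functions analyzed in the paper; note the sharply different diffusion coefficients d_u and D_v.

```python
import math

# Grid and parameters (all values are illustrative assumptions).
n, dt = 128, 1e-3
L = 2 * math.pi
dx = L / n
d_u, D_v = 0.01, 1.0          # sharply different diffusion coefficients
lam_u, lam_v = 0.1, 0.1       # degradation coefficients
x = [i * dx for i in range(n)]
eta = [0.5 * (1 + math.cos(xi)) for xi in x]   # assumed maternal gradient
zeta = [0.2] * n                               # assumed constant source

def lap(w):
    """Periodic 1D Laplacian by central differences."""
    return [(w[i - 1] - 2 * w[i] + w[(i + 1) % n]) / dx**2 for i in range(n)]

def step(u, v):
    """One explicit Euler step with illustrative nonlinearities f, g."""
    lu, lv = lap(u), lap(v)
    un = [u[i] + dt * (d_u * lu[i] - lam_u * u[i]
                       + u[i] * v[i] - u[i]**3 + eta[i]) for i in range(n)]
    vn = [v[i] + dt * (D_v * lv[i] - lam_v * v[i]
                       - u[i] * v[i] + zeta[i]) for i in range(n)]
    return un, vn

u, v = [0.0] * n, [0.0] * n
for _ in range(5000):
    u, v = step(u, v)
# The gradient eta imprints a spatially heterogeneous u-profile.
print(max(u), min(u))
```

The explicit scheme is only stable for \(D_v \, dt/dx^2 \le 1/2\); with these assumed values the ratio is about 0.4.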
Main results
The following assertions, stated in a non-formal manner, summarize the main results. The first one (Claim I) is a corollary of previous results^{9}:
For generic f, g, the open system (Eqs. 2, 3) enjoys the property of Universal Dynamical Approximation (UDA): these systems can generate all possible finite-dimensional structurally stable dynamics when we vary their parameters P. In particular, these systems are capable of generating hyperbolic chaotic dynamics, such as the Smale horseshoe, the Anosov flow, etc. These chaotic sets can have any dimension.
More details and formal definitions of UDA can be found in the section on methods below. Note that the reaction part is fixed, and to obtain the needed dynamics we adjust the diffusion and degradation coefficients and the external gradients \(\zeta , \eta\). The words ‘generic open’ mean that the choice of f, g excludes gradient-like and monotone reaction–diffusion systems. In fact, gradient-like systems have Lyapunov functions, which can be interpreted physically as an energy decreasing along trajectories. In general, the existence of such an energy leads to convergence to equilibria and rules out stable chaotic or periodic regimes (although an unstable complicated large-time behavior is possible^{26}). In our case the sense of ’genericity’ (the fundamental concept introduced by Thom^{27}) is that if some reaction parts f, g are ’bad’, i.e., our assertion does not hold for them, then a small smooth correction transforms these functions into ’good’ ones, for which the assertion is valid. Note that the conditions on f, g can be formulated explicitly^{9}; in particular, it can be checked that these conditions hold for the Brusselator and other models.
For general multicomponent systems, one can show the following (Claim II):
Under some conditions on the reaction part f, the systems defined by Eq. (1) are capable of generating \(2^{M_a}\) different structurally stable local attractors (which may be periodic or even chaotic), where \(M_a > C_f m d_{min}^{-1/4}\), \(d_{min}\) denotes the minimal diffusion coefficient, \(C_f>0\) is a constant depending on the nonlinear part f, and m is the number of components (involved reagents).
We can consider the external sources \(\eta (x,y)\) in Eqs. (2) and (3) as spatial concentrations of certain maternal morphogens (for example, Bicoid and Nanos in Drosophila^{28}). To obtain a complicated attractor, we need an appropriate choice of the functions \(\eta (x,y)\). We can thus consider those functions as carrying positional information, which helps to create a spatiotemporal pattern. Moreover, we note that in order to have a complex pattern u(x, y, t) we need diffusion coefficients of different orders, i.e., some reagents should diffuse much faster than others. So, our result means that, under certain conditions on the diffusion coefficients and on the reaction part, positional information can be transformed into a spatiotemporal structure with complex large-time behavior.
In the section on the methods, we show (following the study^{13}) how chaotic attractors can serve as cellular developmental programs (CDPs) and create complicated spatiotemporal cell patterns. Therefore, it is shown that the dynamical systems given by Eq. (1) (which realize the combined Turing–Wolpert mechanism) are capable of realizing all possible cell developmental programs (CDPs). The maximal possible number of these programs is not less than \(2^{M_{dvp}}\) with \(M_{dvp} \propto M_a\). So, we can formulate the following non-formal assertion:
Assertion on the superpower of Turing–Wolpert mechanism
The combined Turing–Wolpert mechanism can produce any cell pattern unfolding in space and time and has a formidable capacity: it is capable of producing exponentially many different cell patterns.
Note that CDPs exhibit more complicated cell dynamics than cellular automata with local interaction. In fact, in the CDP case two subsequent rows of cells in time can be arbitrary (see Fig. 1), whereas if they are generated by a cellular automaton with a local interaction, there must be correlations between subsequent rows.
The proof of Claim I can be obtained from the results in the paper^{9} and a simple construction from dynamical systems theory. The proof of Claim II is based on Claim I. Below, we outline the main ideas of these proofs. Moreover, in the Supplementary Material (SM), we consider two-component reaction–diffusion systems with spatial inhomogeneities, which can simulate the well-known Rössler system exhibiting chaotic behavior. This example, and simulations made for its finite-difference analog, illustrate the main principles of attractor and pattern generation stated below, in particular, the complex spatial structure of activation and inhibition by a slowly diffusing component and the formation of the layers (segments).
Methods
Stochastic Turing machine and cell pattern generation
Let us outline the key points of our approach. Real organisms can include tens and hundreds of types of differentiated cells. The most effective scheme to handle this variety of patterns is to use a binary tree for cell differentiation. As an example, let us consider an organism consisting of 4 cell types (the generalization to larger sets of cell types is straightforward). We denote these cell types by gr, wr, gb, and wb. This notation can be explained as follows. Suppose cells r and b arise from non-differentiated cells acquiring features r and b. Further, at the second stage of differentiation, a differentiated cell can acquire the additional features g and w. We encode the cells gr, wr, gb and wb as 00, 01, 10 and 11, respectively. For example, the string \(a=(00, 01, 01, 01, 11, 10, 00, 11, 11)\) corresponds to the cell pattern gr, wr, wr, wr, wb, gb, gr, wb, wb. Further, we consider, for simplicity, binary strings consisting of r, b cells.
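The binary-tree encoding above can be sketched as a simple pair of lookup tables; the function names are ours, but the code assignment (gr→00, wr→01, gb→10, wb→11) and the example string follow the text.

```python
# Binary encoding of the four cell types described in the text:
# the first bit records the g/w feature, the second the r/b feature.
CODE = {"gr": "00", "wr": "01", "gb": "10", "wb": "11"}
TYPE = {v: k for k, v in CODE.items()}

def encode(pattern):
    """Turn a sequence of cell-type labels into 2-bit codes."""
    return [CODE[c] for c in pattern]

def decode(bits):
    """Recover the cell-type labels from 2-bit codes."""
    return [TYPE[b] for b in bits]

a = ["gr", "wr", "wr", "wr", "wb", "gb", "gr", "wb", "wb"]
print(encode(a))  # ['00', '01', '01', '01', '11', '10', '00', '11', '11']
```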
Dynamical systems and cell patterns
Consider a layered one-dimensional (1D) pattern consisting of cells of two types, red and blue. We decompose our ’organism’ into small domains of equal length occupied by cells. Then the pattern of red and blue cells can be considered as a string, for example, a periodic string (rbrbrbrb) (case P), a simple string (rrrrbbb) (case S), or a general non-periodic one (case G). Suppose the cell type depends on the level of a morphogen \(u \in (0,1)\), say, for \(u \in R_1=(0, 1/2)\) we have the r-cell, and for \(u \in R_2=(1/2, 1)\) we have the b-cell. Then the periodic pattern P can be obtained from a periodic morphogen concentration via the Turing instability, and the simple pattern S can be generated by a monotone morphogen gradient via Wolpert’s mechanism. However, to obtain a complex non-periodic pattern of a large length, we need sophisticated morphogen concentration profiles u(x). It is hard to obtain such u-gradients by diffusion from a source. Moreover, we would like to generate any prescribed pattern by a universal and short model. In fact, we know that many genes driving morphogenesis are shared between many different organisms. These genes are highly conserved in evolution^{29,30}. On the other hand, we also know that universal TMs are capable of generating any string in time and they admit a compact description. Moreover, any time string can be obtained by chaotic dynamical systems^{11,12}, finite automata and Markov chains (see below). So, there are simple dynamic models simulating TMs and generating all possible time cell strings for a few cell types.
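The claim that a chaotic dynamical system can emit any prescribed time string can be illustrated with a textbook example (our choice, not the paper's reaction–diffusion system): the doubling map \(x \mapsto 2x \bmod 1\). Its symbolic itinerary equals the binary expansion of the initial condition, so any r/b string can be realized by encoding it into \(x_0\); exact rational arithmetic avoids floating-point drift.

```python
from fractions import Fraction

def doubling_orbit_symbols(x0, n):
    """Iterate the chaotic doubling map x -> 2x mod 1 and read off the
    symbol 'r' (x < 1/2) or 'b' (x >= 1/2) at each step."""
    x, symbols = x0, []
    for _ in range(n):
        symbols.append('r' if x < Fraction(1, 2) else 'b')
        x = (2 * x) % 1
    return ''.join(symbols)

def initial_state_for(target):
    """Encode a prescribed r/b string into an initial condition whose
    binary digits are exactly the target itinerary."""
    bits = ['0' if s == 'r' else '1' for s in target]
    return Fraction(int(''.join(bits), 2), 2 ** len(bits))

target = 'rbrrbbrb'
x0 = initial_state_for(target)
print(doubling_orbit_symbols(x0, len(target)))  # rbrrbbrb
```

The map is chaotic (positive Lyapunov exponent \(\ln 2\)), yet the prescribed string is reproduced deterministically from the right initial data.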
We show how to use stochastic models to generate prescribed time strings. These models have a sufficient number of states (to simulate all types of cells) and function in time up to a stop signal. The stop signal mechanism, borrowed from TM theory, provides canalization (robustness) and regeneration.
Suppose that we would like to obtain a phenotype given by a pattern \(u(x,s^*)\) and encoded by a target gene expression string \(s^*=s_1^* \cdots s_{m}^*\), where \(s^* \in S^m=\{0,1\}^m\). At each time moment t we have \(n=2^{m}\) states \(s \in \{0,1\}^{m}\). Let \(w(s, s^{\prime }, \xi )\) be the transition probability from \(s^{\prime }\) to s per step, depending on the environment state \(\xi\). Then the stochastic gene expression dynamics can be defined, for example, by the following Markov chain
where \(p_s(t)\) denotes the probability of being in the state s at the moment t. Assume that the terminal state is absorbing, i.e., the process stops when \(s=s^*\) (i.e., \(w(s, s^*)=0\) for all \(s \ne s^*\)). This dynamical model includes noisy and chaotic dynamical systems. The main idea is that under fairly general conditions we reach the target state \(s^*\) within a bounded time period \(\tau (s^*)\). This time is a random quantity; however, for many processes \(\tau (s^*)\) can be estimated. This construction describes a realization of a stochastic Turing machine, or more specifically, a random finite automaton, where \(s^*\) is a terminal state. Note that the dependence on the environment parameter \(\xi\) can be important for plant development, where morphogenesis depends on environmental cues. The morphogenesis time \(\tau\) is proportional to the number of steps in time needed to reach the target terminal state \(s^*\).
This stochastic mechanism is feasible, however, only when the state number n is not too large. We estimated the dependence of \(\tau\) on the state number n by numerical simulations for Markov chains with constant transition probabilities. We found that the average \(E\tau _{\epsilon }\), where the expected value is taken over all absorbing Markov chains, is a linear function of the state number n. The standard deviation of \(\tau _{\epsilon }\) is small for large n with respect to \(E\tau _{\epsilon }\) (see Figs. A7, A8 and A9 in SM).
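The linear growth of the mean absorption time with n can be checked with a minimal Monte Carlo sketch. We assume the simplest possible chain (uniform transition probabilities with a single absorbing state), which is our illustrative choice rather than the ensemble simulated in the paper's SM; for this chain the absorption time is geometric with mean n.

```python
import random

def absorption_time(n, rng):
    """Simulate one absorbing Markov chain on n states: from any
    non-absorbed state the next state is uniform over all n states,
    and state 0 is absorbing. Returns the number of steps taken."""
    state, steps = rng.randrange(1, n), 0
    while state != 0:
        state = rng.randrange(n)   # uniform transition probabilities
        steps += 1
    return steps

rng = random.Random(0)
for n in (50, 100, 200):
    mean = sum(absorption_time(n, rng) for _ in range(2000)) / 2000
    print(n, round(mean, 1))  # mean absorption time grows roughly linearly in n
```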
So, for large n we have a large \(\tau _{\epsilon }(n) \approx const \ n\), and for too large n the stochastic morphogenesis may continue for a long time. We can overcome this difficulty by assuming that the long target pattern \(s_1^* \cdots s_n^*\) is split into intermediate target patterns \(s^{*, k}=s_{m_k}^* \cdots s_{m_{k+1}}^*\), so that the long string \(s^*\) is a concatenation of the shorter strings \(s^{*, k}\), \(k=1, \dots , p\). The index k indicates the kth stage of morphogenesis.
Then the full time \(\tau (s^*)\) needed to obtain the target string is the sum \(\tau (s^*)=\sum _{k=1}^{N_m} \tau (s^{*, k})\). If all the processes are independent, and a new process starts as soon as the previous stage is completed, then for large numbers \(N_m\) the total time \(\tau (s^*)\) will be very stable due to classical theorems of probability theory (although the times for separate stages are random and can fluctuate). So, we conclude that stochastic multistage morphogenesis is a well-canalized process, which is robust against external perturbations \(\xi\).
Let us consider a simple example: a random search for a target string on the m-dimensional Boolean hypercube \(S^m\). Let \(s^*\) be a target pattern, chosen uniformly at random among the vertices of \(S^m\). We choose another random vertex \(s^0\) as a starting state. At each step, we move from the current vertex to a neighboring one (which corresponds to a flip of some component \(s_j\)). Our walk ends when we reach the target state. For a random initial state, such a search takes \(\tau =O(2^{m/2})\) steps (see Fig. A8, the top right line in SM). Note that random walks on the hypercube are considered in the work^{31}.
However, if the process goes in a few stages, then the time \(\tau\) is substantially smaller and its fluctuations are smaller. In fact, let us represent a target string \(s^*\) as a concatenation of several strings \(s^{*,1}, \ldots , s^{*, k}\). Suppose for simplicity that the lengths \(l(s^{*, j})\) of the \(s^{*, j}\) are equal: \(l(s^{*, j})=m^{\prime }= m/k\), and the search for \(s^{*}\) goes in k stages. First we go to \(s^{*,1}\) by a random walk on the hypercube \(S^{m^{\prime }}\) of dimension \(m^{\prime }\). As soon as the process reaches \(s^{*,1}\), we fix the first \(m^{\prime }=m/k\) components of s. Then we move to \(s^{*,2}\) by a random walk on the second hypercube of dimension \(m^{\prime }\), etc. The whole search requires \(O( k 2^{m^{\prime }})\) steps. This means that we obtain an exponential acceleration by decomposition (the results of numerical simulations are shown in Fig. A8 in SM). We define the normalized standard deviation \(\sigma (\tau )\) of \(\tau\) as \(\sigma (\tau )=\sqrt{Var \ \tau }/E\tau\), where EX and \(Var \ X\) denote the expected value and the variance of a random quantity X. Computations then show that \(\sigma (\tau ) \approx 1/\sqrt{k}\), whereas \(\sigma (\tau ) \approx 1\) when the process goes in a single stage. The stochastic dynamics defined by Eq. (4) can be used to drive morphogenesis. The key idea is as follows. Each pattern is encoded by a genotype s. We suppose that the pattern corresponding to a viable organism (obtained by evolution) is encoded by \(s^*\). Then we use \(s^*\) as a terminal (absorbing) state in the dynamics defined by the Master equation (4). In the next subsection, we consider the realization of the dynamics (Eq. 4) and TMs via reaction–diffusion models.
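The exponential acceleration from staging can be demonstrated with a small simulation. This is a hedged sketch under simplifying assumptions: the "search" flips one uniformly random bit of the current block per step (a random walk on the sub-hypercube) until the block matches the target, then freezes it; function names and parameter values are ours.

```python
import random

def staged_search_steps(m, k, rng):
    """Random search for a random m-bit target split into k stages of
    m//k bits: in each stage, flip random bits inside the current block
    until it matches the target's block, then freeze it and move on."""
    block = m // k
    target = [rng.randrange(2) for _ in range(m)]
    state = [rng.randrange(2) for _ in range(m)]
    steps = 0
    for j in range(k):
        lo, hi = j * block, (j + 1) * block
        while state[lo:hi] != target[lo:hi]:
            state[rng.randrange(lo, hi)] ^= 1  # one random bit flip
            steps += 1
    return steps

rng = random.Random(1)
means = {k: sum(staged_search_steps(12, k, rng) for _ in range(100)) / 100
         for k in (1, 4)}
print(means)  # the 4-stage search is dramatically faster than the single stage
```

For m = 12 the single-stage search needs on the order of \(2^{12}\) steps, while four stages need roughly \(4 \cdot 2^{3}\), in line with the \(O(k 2^{m/k})\) estimate above.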
Universal dynamical approximation
Let us outline the concept of Universal Dynamical Approximation (UDA) (the term is inspired by the work^{32}). To explain this concept, recall that many neural networks, such as multilayer perceptrons (MLPs), enjoy the property of universal approximation: for each prescribed sufficiently smooth output function f defined on a compact domain \(D \subset \textbf{R}^n\) of a Euclidean space \(\textbf{R}^n\) and each \(\epsilon >0\), we can adjust the parameters of the MLP in such a way that the output \(f_{net}(q)\) of the network is \(\epsilon\)-close to f(q) for all entries q from the domain D^{33}.
The UDA concept generalizes universal approximation for networks. Many evolution equations under reasonable boundary conditions define global semiflows in appropriate functional phase spaces^{34,35}. We denote such semiflows by \(S^t_\textbf{P}\); they depend on the parameters \(\textbf{P}\) involved in the equations and boundary conditions. More formally, let us consider an evolution equation in a Banach space \(\textbf{B}\) depending on the parameter \(\textbf{P}\):
Assume that for some \(\textbf{P}\) this equation generates a global semiflow \(S^t\). We obtain then a family \({\mathscr {F}}\) of global semiflows \(S^t_\textbf{P}\) where each semiflow depends on the parameter \(\textbf{P}\).
Suppose for an integer \(n> 0\) there is an appropriate value \(\textbf{P}_n\) of the parameter \(\textbf{P}\) such that the corresponding global semiflow \(S^t_{\textbf{P}_n}\) has an ndimensional normally hyperbolic locally invariant manifold \({{\mathscr {M}}}_n\) embedded in our phase space \(\textbf{B}\) by a \(C^1\)smooth map defined on a ball \(\textbf{B}^n \subset \textbf{R}^n\).
The restriction of the semiflow \(S^t_{\textbf{P}_n}\) to \({{\mathscr {M}}}_n\) is then defined by a vector field Q on \({{\mathscr {M}}}^n\). We then say that the family \(S^t_\textbf{P}\) realizes the vector field Q (this terminology was coined by Poláčik^{26,36}).
The family \(S^t_\textbf{P}\) enjoys UDA if for each dimension n this family realizes a dense (in the norm \(C^1(\textbf{B}^n)\)) set of vector fields Q on the ball \(\textbf{B}^n\).
Corollary 1
If the family \(S^t_\textbf{P}\) has the UDA property, the Theorem on Persistence of Hyperbolic Sets implies that some semiflows \(S^t_\textbf{P}\) exhibit a chaotic large-time behavior.
In other words, one can say that the UDA semiflows can simulate, by parameter variations, any finite-dimensional dynamics defined by a system of ordinary differential equations on the domain \(D \subset \textbf{R}^n\) within any prescribed accuracy \(\epsilon\) (in the \(C^1(D)\)-norm). This property implies that the semiflows \(S_\textbf{P}^t\) can generate all structurally stable dynamics (up to orbital topological equivalence). Among the systems enjoying UDA there are a number of fundamental ones: quasilinear parabolic equations^{26,36}, time-continuous and time-recurrent neural networks^{37}, a large class of reaction–diffusion systems with heterogeneous sources^{9}, the generalized Lotka–Volterra system^{38}, and the Oberbeck–Boussinesq model^{39}. Also, the Euler equations on multidimensional manifolds exhibit similar properties^{32}. Note that for time-continuous and time-recurrent neural networks and the generalized Lotka–Volterra system, the UDA property follows from the Universal Approximation Theorem for MLPs^{37}. Such systems can be Turing complete and used as TMs.
Attractor and pattern formation mechanisms
We follow the work^{9} and, for simplicity, restrict ourselves to the two-component case \(m=2\) (see Eqs. 2 and 3). Let the corresponding diffusion coefficients be sharply different: \(d_1=d_u=\epsilon ^2 \ll d_2=D_v\), where \(\epsilon >0\) is a small parameter. The main idea is as follows. We use local excitation for the slow component u and long-range nonlocal interaction between v and u. First, the spatial gradients \(\eta , \zeta\) cause the formation of many well-localized segments, where the slowly diffusing reagent u is concentrated.
The reagent u interacts with the fast diffusing component v, which, in turn, acts on u, and so we obtain nonlinear nonlocal feedback and complex dynamics. The dynamics induced by each group of interacting segments generate different local attractors. We can have at least \(N_{loc}\) such groups. So, by adjusting different initial data for u, v, we can obtain \(2^{N_{loc}}\) different local attractors, and each attractor (if it is chaotic and structurally stable) can generate the corresponding CDPs. So, we show that fairly general chemical reactors with spatial gradients can create chemical patterns consisting of many narrow segments, and those patterns help to activate a number of different CDPs.
In more detail, for each \(N =O({\epsilon }^{-1/2})\) we can adjust the gradients \(\eta , \zeta\) so that the u-component is a linear combination of N fixed smooth functions \(\psi _i(x)\) with coefficients \(X_i(t)\) slowly depending on time. Each \(\psi _i\) is well localized at a point \({\bar{x}}_i\): it is an exponentially decreasing function of \(\epsilon ^{-1}(x-{\bar{x}}_i)\). Then the reagent u is concentrated near the lines \(x={\bar{x}}_i\), and the expression level of u at the ith line is determined by the magnitude \(X_i\). Thus we are dealing with an analog of the segmentation process omnipresent in morphogenesis (in fact, it is reminiscent of segmentation in Drosophila, where the Bicoid morphogen gradient causes the formation of segments as a result of a complex interaction with other proteins^{40,41,42}). The large-time dynamics of the magnitudes \(X_i\) are defined by a large N-component coupled oscillator system. The ith oscillator with the state \(X_i(t)\) interacts with the other ones via linear interaction terms. Due to the localization properties of \(\psi _i\), if the distances between oscillators are much larger than \(\epsilon\), the interaction force between oscillators is exponentially weak. Using this interaction fading property, we can decompose the coupled oscillator system into \(M_a\) almost independent dynamical systems, where each system involves \({\bar{n}}=N/M_a\) variables. To this end, we decompose the points \({\bar{x}}_i\), \(i=1, ..., N\), into groups \(\bar{x}_{j, k}\), where \(k=1, ..., M_a\) and \(j=1, ..., {\bar{n}}\). The points of the kth group are located close to each other (distances of the order \(\epsilon ^{1/2}\)), while segments belonging to different groups are separated (distances between segments from different groups are \(> 1/(2N)\)). By varying the parameters of the system (Eq. 1), one can obtain the following.
The kth segment group generates an \({\bar{n}}\)-dimensional dynamical system, which has at least two local hyperbolic attractors \({{\mathscr {A}}}_k^{(l)}\), \(l=1,2\). The whole big dynamical system for all the magnitudes \(X_i\) then consists of \(M_a\) almost independent (up to exponentially small corrections) \({\bar{n}}\)-dimensional dynamical systems. This big system then has at least \(2^{M_a}\) different local attractors.
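The counting argument above can be illustrated with a deliberately simplified toy model (our construction, not the reaction–diffusion system itself): take \(M_a\) effectively independent bistable subsystems, each of the form \(\dot{X} = X - X^3\) with two hyperbolic attractors \(X = \pm 1\). The sign pattern of the initial data then selects one of \(2^{M_a}\) attractors of the product system.

```python
def settle(x0, dt=0.01, steps=2000):
    """Integrate the bistable ODE dX/dt = X - X**3 by the Euler method;
    the trajectory converges to +1 or -1 depending on the initial sign."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x ** 3)
    return x

# M_a = 4 independent bistable subsystems: each sign pattern of the
# initial data selects a different attractor, 2**4 = 16 patterns in all.
initial = [0.3, -0.7, 0.5, -0.2]
final = [settle(x0) for x0 in initial]
print([round(x) for x in final])  # [1, -1, 1, -1]
```

In the paper's setting the attractors can be periodic or chaotic rather than fixed points, but the combinatorics of \(2^{M_a}\) choices is the same.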
Transformation of chaotic time dynamics into developmental programs
In this subsection, we describe how chaotic (or stochastic) gene signals in time can be transformed into cellular patterns of terminal differentiation in space. Let us first state a formal mathematical definition of a cell developmental program. Such programs transform smooth morphogenetic fields into discrete cell patterns.
Let us consider strings \(s=(s_1, s_2, ..., s_{m})\) consisting, for example, of symbols \(s_j \in \{r, b\}\) (an extension to larger alphabets is straightforward). The set of strings \({{\mathscr {S}}}_D=\{ s^{(1)}, s^{(2)}, ..., s^{(m)} \}\), the time moments \(t_j=j\Delta T\) and points \({\bar{x}}_1< {\bar{x}}_2< \cdots < {\bar{x}}_n\), where \(j=1,..., n\), can be called a developmental program if the morphogen values take prescribed values at given time moments \(t_k\) at given points \({\bar{x}}_j\):
where a morphogenetic operator \({{\mathscr {M}}}\) transforms 1D patterns u(x) into strings s of the length n. This formal definition is illustrated by Fig. 1. In the simplest case, similarly to the French flag model, one has \({{\mathscr {M}}}(u(\cdot )) ({\bar{x}}_j)=r\) if \(u({\bar{x}}_j)>u_0\) and \({{\mathscr {M}}}(u(\cdot )) (\bar{x}_j)=b\) otherwise. For a more complicated example connected with a partition of the phase space of morphogen concentrations, see Fig. 2.
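The simplest French-flag-style operator \({\mathscr{M}}\) described above can be sketched in a few lines. The function name, the sample profile u, and the sampling points are hypothetical illustrations; only the thresholding rule (r above \(u_0\), b otherwise) comes from the text.

```python
import math

def morphogenetic_operator(u, points, u0=0.5):
    """French-flag-style operator M: read the morphogen profile u at the
    points x_j and emit 'r' above the threshold u0, 'b' otherwise."""
    return ''.join('r' if u(x) > u0 else 'b' for x in points)

# A hypothetical smooth morphogen profile and sampling points.
u = lambda x: 0.5 * (1 + math.sin(3 * x))
points = [0.1, 0.6, 1.2, 2.1, 2.8]
print(morphogenetic_operator(u, points))  # rrbrr
```

More elaborate operators (partitions of the morphogen phase space, as in Fig. 2) replace the single threshold by a map from concentration regions to cell types.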
We use the results of the works^{11,12,13}, which can be described as follows. The idea is illustrated by Fig. 1. The left column of cells is a string \(\textbf{a}_1=a(1, t_1) a(1, t_2) \cdots a(1, t_K)\), \(a_i \in \{r, b\}\), the next one is a string \(\textbf{a}_2\), and the last column is a string \(\textbf{a}_{N_c}\), where \(N_c\) is the number of cells, K is the number of steps in time, and \(a(j, t_k)\) denotes the type of the cell that emerged at the jth position along the x-axis at the moment \(t_k\). Each column (a string, or a 1D pattern) can be generated by a TM. In turn, these TMs can be simulated by few-dimensional chaotic dynamical systems^{11,12}. Such a dynamical system is defined by the interactions between the amplitudes \(X_{i}(t)\) corresponding to the kth segment group (see the previous subsection). So, we set \(M_a=N_c\), take \(M_a\) chaotic dynamical systems, and let each system generate the prescribed string (a column in Fig. 1) at a moment in time.
Let us first consider how to generate 1D patterns consisting of cells of two types, say, red and blue cells. Imagine an arbitrary string of cells of different types located along the x-axis (this might model 1D organisms, like a worm, or a segmented embryo, see Fig. 1). Let w be a morphogen whose concentration is bounded, \(w \le w_{+}\). Suppose the red cells appear when the average concentration of w in a cell lies in \(W_r=(0, w_{+}/2)\) and the blue cells emerge when this concentration exceeds \(w_{+}/2\): \(w \in W_b=(w_{+}/2, w_{+})\). Therefore, to generate such strings, we should satisfy the conditions \({\bar{w}}(j, t_k) \in W_{a(j,t_k)}\) for all \(j=1,\ldots , N_c, \ k=1,\ldots , K\), where \({\bar{w}}(j, t_k)\) is the morphogen concentration averaged over the jth cell at the moment \(t_k\) of cell differentiation. Suppose that the u-reagent serves as a source of the morphogen w: \(d_w \Delta w - b^2 w=u-u_0\), where the parameter \(\beta =b\,d_w^{-1/2}\) determines the rate of decrease of the w-concentration in space and \(u_0(x,y)\) is a function that we adjust in an appropriate way. The conditions \({\bar{w}}(j, t_k) \in W_{a(j,t_k)}\) can be satisfied if the time evolution of the amplitudes \(X_j(t)\) corresponding to the segments located at the jth cell is governed by chaotic (or stochastic) dynamics. Note that the precise choice of this dynamics plays no role, but the corresponding attractor should be ergodic and mixing^{13}.
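The step from u to the cell string can be sketched numerically (a finite-difference sketch under assumed zero boundary conditions, with an invented u-profile, a zero offset \(u_0\), and grid parameters chosen for the example):

```python
import numpy as np

# Solve d_w * w'' - b^2 * w = u - u0 on [0, L] for the morphogen w,
# then classify each cell by its averaged concentration.
d_w, b, L, n = 1.0, 2.0, 1.0, 200
x = np.linspace(0.0, L, n)
h = x[1] - x[0]
u = np.sin(2.0 * np.pi * x)        # hypothetical u-reagent profile
u0 = np.zeros(n)                   # hypothetical offset u_0

# Tridiagonal matrix for d_w * w'' - b^2 * w at the interior grid points.
A = d_w / h**2 * (np.diag(-2.0 * np.ones(n - 2))
                  + np.diag(np.ones(n - 3), 1)
                  + np.diag(np.ones(n - 3), -1)) - b**2 * np.eye(n - 2)
w = np.zeros(n)
w[1:-1] = np.linalg.solve(A, (u - u0)[1:-1])

# Shift so that 0 <= w <= w_plus, average over N_c cells, apply the rule
# W_r = (0, w_plus/2) -> red, W_b = (w_plus/2, w_plus) -> blue.
w = w - w.min()
w_plus = w.max()
N_c = 4
w_bar = w.reshape(N_c, -1).mean(axis=1)
cells = ["r" if wb < w_plus / 2.0 else "b" for wb in w_bar]
print(cells)  # -> ['r', 'r', 'b', 'b']
```

The screened equation smooths u on the length scale \(1/\beta\), so the averaged levels \({\bar{w}}(j, t_k)\) track the slow spatial structure of the u-signal rather than its fine fluctuations.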
Discussion
We now discuss phenotype morphogenesis and its robustness. The problem of pattern formation robustness, initiated by the seminal papers^{15,16}, has been intensively discussed in recent decades^{17,18,20,42,43}, among many others; see the review^{21}. A number of different approaches have been proposed: heat shock proteins and other molecular chaperones, methylation, microRNAs, emergent or embedded mechanisms based on gene regulation network properties^{18,42,44}, and nonlinearities in development.
The approach proposed in this paper is directly opposite to many previous ones, where the formation of targeted patterns and robustness are achieved by a delicate tuning of gene network properties^{42,44,45}. Based on Turing machine theory and results of dynamical systems theory, we come to a counterintuitive conclusion: a delicate tuning of reaction parts or network interconnections is not necessary to generate complex target patterns and attractors in a robust manner. The general idea can be formulated as follows: timing instead of tuning. We can take any dynamical system that generates complicated Turing-complete dynamics (a number of such systems are now known), and then, if the target pattern is predetermined, this dynamics generates that pattern. The mechanism of pattern formation is robust with respect to noise and other perturbations if it proceeds in several stages, each of which ends, upon reaching a certain terminal state, with a stop signal (similarly to Turing machines). The idea that evolution can use any means to produce effective structures (evolutionary tinkering) was proposed by Jacob^{6}. Here, we assert that practically any dynamical system with complicated large-time behavior can be used to generate complicated patterns.
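The "timing instead of tuning" mechanism with stop signals can be sketched as follows (a toy sketch: the noisy logistic map is a stand-in for the chaotic gene dynamics, and the target windows and initial state are hypothetical):

```python
import random

def run_stage(x, window, max_steps=100_000):
    """One developmental stage: iterate noisy chaotic dynamics until the
    state enters the target window, then halt (the 'stop signal')."""
    lo, hi = window
    for _ in range(max_steps):
        if lo <= x < hi:
            return x                         # terminal state: stage ends
        x = 3.9 * x * (1.0 - x) + random.uniform(-1e-3, 1e-3)
        x = min(max(x, 0.0), 1.0)            # keep the state in [0, 1]
    raise RuntimeError("stage did not terminate")

random.seed(0)
stages = [(0.0, 0.2), (0.8, 1.0), (0.4, 0.6)]  # hypothetical target windows
x, pattern = 0.37, []
for window in stages:
    x = run_stage(x, window)                 # mixing dynamics visits every window
    pattern.append("r" if x < 0.5 else "b")
print(pattern)
```

Because the dynamics is mixing, each stage halts in finite time with probability one, and a perturbation within a stage only delays the stop signal rather than corrupting the terminal state, which is the source of the robustness.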
For popular reaction-diffusion models, our mechanism of complex pattern and attractor generation is based on two fundamental ideas: a generalisation of Turing instability^{2} and Wolpert's concept of positional information^{1}. Note that the idea of using spatial heterogeneity was proposed already in Turing's paper^{2}. One can take a sufficiently general chemical interaction and simply adjust the correct positional information; such robust morphogenesis can be performed via practically arbitrary chemical dynamics (if the spatially extended gradients are adjusted correctly). In fact, we show that if Wolpert's and Turing's pattern formation mechanisms work together, then they can create, in a practically arbitrary nonlinear chemical reactor (corresponding to an open chemical system), attractors of high complexity, and can even produce a number of such attractors and the corresponding cell patterns.
Note that experimental data show that many genes exhibit oscillatory behavior^{46,47}. A model describing chaotic behavior of transcription factors (TFs) is proposed in^{47}, where it is shown that such chaotic behavior could be useful for survival and leads to the formation of heterogeneous cell populations. Oscillatory gene dynamics was used to describe somitogenesis; well-studied examples of pattern formation in the developing embryo are presented in^{7,8,46}, and the first mathematical model was introduced in^{48}. According to a popular formulation appearing in encyclopedias such as Wikipedia, "Somites are bilaterally paired blocks of paraxial mesoderm that form along the anterior–posterior axis of the developing embryo in segmented animals. In vertebrates, somites give rise to skeletal muscle, cartilage, tendons, endothelium, and dermis". Although at an initial stage somites manifest a periodic layered structure, the intrinsic states of the cells are in fact not periodic (otherwise, it would be impossible to obtain complex organs in further development). So, the true pattern is not periodic, and we believe that chaotic dynamics, rather than time-periodic dynamics, should generate such non-periodicity in space-layered patterns.
These results show that there are chemical systems that can be considered in analogy with universal Turing machines (UTMs). A UTM can perform all computations that can be done by other TMs, and so UTMs generate all possible string outputs as we vary their input. In our case, we have fixed (up to a few parameters to adjust) spatially extended systems which, depending on the initial data, generate different developmental patterns. So, open chemical systems of fairly general form serve as universal pattern generators (UPGs) of spatiotemporal patterns, and that generation is based on complicated gene expression, chaotic or stochastic. A few genes are sufficient to encode these UPGs. Therefore, it is natural to expect that such chemical media could appear as a result of biological evolution.
Data availability
All data generated or analyzed during this study are included in this published article and its supplementary information files.
References
Wolpert, L., Tickle, C. & Jessell, T. Principles of Development (Oxford University Press, 2002).
Turing, A. The chemical basis of morphogenesis. Phil. Trans. R. Soc. B 237, 37–72 (1952).
Vadde, B. V. L. & Roeder, A. H. K. Can the French flag and reaction-diffusion models explain flower patterning? Celebrating the 50th anniversary of the French flag model. J. Exp. Bot. 71, 2886–2897. https://doi.org/10.1093/jxb/eraa065 (2020).
Green, B. A. & Sharpe, J. Positional information and reaction-diffusion: Two big ideas in developmental biology combine. Development 142, 1203–1211. https://doi.org/10.1242/dev.114991 (2015).
Brenner, S. Life’s code script. Nature 482, 461–461 (2012).
Jacob, F. Evolution and tinkering. Science 196, 1161–1166 (1977).
Maroto, M., Bone, R. A. & Dale, J. Somitogenesis. Development 139, 2453–2456 (2012).
Resende, T. et al. Sonic hedgehog in temporal control of somite formation. PNAS 107, 12907–12912 (2010).
Vakulenko, S. Complex attractors and patterns in reactiondiffusion systems. J. Dyn. Diff. Equat. 30, 175–207 (2018).
Meinhardt, H. Turing’s theory of morphogenesis of 1952 and the subsequent discovery of the crucial role of local selfenhancement and longrange inhibition. Interface Focus 2, 407–416 (2012).
Moore, C. Unpredictability and undecidability in dynamical systems. Phys. Rev. Lett. 64, 2354–2357 (1990).
Moore, C. Generalized shifts: Unpredictability and undecidability in dynamical systems. Nonlinearity 4, 199–230 (1991).
Vakulenko, S. A. & Grigoriev, D. New way for cell differentiation: Reaction, diffusion and chaotic waves. Biosystems 212, 104605. https://doi.org/10.1016/j.biosystems.2021.104605 (2022).
Sudakow, I., Vakulenko, S. A. & Grigoriev, D. Excitable media store and transfer complicated information via topological defect motion. Commun. Nonlinear Sci. Numer. Simulat. 116, 106844. https://doi.org/10.1016/j.cnsns.2022.106844 (2023).
Waddington, C. H. Canalization of development and the inheritance of acquired characters. Nature 150, 563–565. https://doi.org/10.1038/150563a0 (1942).
Waddington, C. H. Genetic assimilation of an acquired character. Evolution 7, 118–126. https://doi.org/10.2307/2405747 (1953).
Bergman, A. & Siegal, M. Evolutionary capacitance as a general feature of complex gene networks. Nature 424, 549–552 (2003).
Siegal, M. L. & Bergman, A. Waddington’s canalization revisited: Developmental stability and evolution. Proc. Natl. Acad. Sci. USA 99, 10528–32 (2002).
Levy, S. & Siegal, M. Network hubs buffer environmental variation in Saccharomyces cerevisiae. PLoS Biol. (2008).
Barr, K., Reinitz, J. & Radulescu, O. An in silico analysis of robust but fragile gene regulation links enhancer length to robustness. PLoS Comput. Biol. 15(11), e1007497 (2019).
Hallgrimsson, B. The developmentalgenetics of canalization. Semin. Cell Dev. Biol. 88, 67–79. https://doi.org/10.1016/j.semcdb.2018.05.019 (2019).
Siegelmann, H. T. Computation beyond the Turing limit. Science 268, 545–548. https://doi.org/10.1126/science.268.5210.545 (1995).
Horsthemke, W. & Lefever, R. Noise-Induced Transitions: Theory and Applications in Physics, Chemistry, and Biology (Springer-Verlag, 1984).
Katok, A. & Hasselblatt, B. Introduction to the Modern Theory of Dynamical Systems (Cambridge University Press, 1997).
Kifer, Y. I. On small random perturbations of some smooth dynamical systems. Izv. Akad. Nauk SSSR Ser. Mat. 38, 1083–1107 (1974).
Poláčik, P. Complicated dynamics in scalar semilinear parabolic equations in higher space dimension. J. Differ. Eqs. 89, 244–271 (1991).
Thom, R. Structural Stability and Morphogenesis (W. A. Benjamin, 1972).
Reinitz, J. & Sharp, D. H. Mechanism of formation of eve stripes. Mech. Develop. (1995).
Newman, S. Cell differentiation: What have we learned in 50 years? J. Theoret. Biol. 485, 110031 (2020).
Newman, S. Physicogenetic determinants in the evolution of development. Science 338, 217–219 (2012).
Moore, C. & Mertens, S. The Nature of Computation (Oxford University Press, 2011).
Tao, T. On the universality of the incompressible Euler equations on compact manifolds. Discrete Continuous Dynam. Syst. https://doi.org/10.3934/dcds.2018064 (2018).
Barron, A. R. Universal approximation bounds for superpositions of a sigmoidal function. Inform. Theory IEEE Trans. 39, 930–945 (1993).
Henry, D. Geometric Theory of Semilinear Parabolic Equations Vol. 840 (Springer-Verlag, 1981).
Constantin, P., Foias, C., Nicolaenko, V. & Temam, R. Integral Manifolds and Inertial Manifolds for Dissipative Partial Differential Equations. Appl. Math. Sci. 70 (Springer-Verlag, 1988).
Poláčik, P. High-dimensional \(\omega\)-limit sets and chaos in scalar parabolic equations. J. Differ. Eqs. 119, 24–53 (1995).
Vakulenko, S., Morozov, I. & Radulescu, O. Maximal switchability of centralized networks. Nonlinearity 29, 2327 (2016).
Kozlov, V. & Vakulenko, S. On chaos in Lotka-Volterra systems: An analytical approach. Nonlinearity 26, 2299 (2013).
Vakulenko, S. A. Strange attractors for Oberbeck-Boussinesq model. J. Dyn. Differ. Eqs. 33, 303–343 (2021).
Mjolness, E., Sharp, D. H. & Reinitz, J. A connectionist model of development. J. Theor. Biol. 152, 429–453 (1991).
Surkova, S. et al. Canalization of gene expression in the Drosophila blastoderm by gap gene cross regulation. PLoS Biol. 7, e1000049 (2009).
Manu et al. Canalization of gene expression in the Drosophila blastoderm by gap gene cross regulation. PLoS Biol. 7, e1000049. https://doi.org/10.1371/journal.pbio.1000049 (2009).
Masel, J. & Siegal, M. L. Robustness: Mechanisms and consequences. Trends Genet. 25, 395–403. https://doi.org/10.1016/j.tig.2009.07.005 (2009).
Jiang, P., Ludwig, M. L., Kreitman, M. & Reinitz, J. Natural variation of the expression pattern of the segmentation gene even-skipped in Drosophila melanogaster. Dev. Biol. 405, 173–181 (2015).
Vakulenko, S., Radulescu, O. & Reinitz, J. Size regulation in the segmentation of drosophila: Interacting interfaces between localized domains of gene expression ensure robust spatial patterning. Phys. Rev. Lett. 103, 168102–168106 (2009).
Pourquié, O. The chick embryo: A leading model in somitogenesis studies. Mech. Dev. 121, 1069–1079 (2004).
Heltberg, M. L., Krishna, S. & Jensen, M. On chaotic dynamics in transcription factors and the associated effects in differential gene regulation. Nat. Commun. https://doi.org/10.1038/s41467-018-07932-1 (2019).
Baker, R., Schnell, S. & Maini, P. A clock and wavefront mechanism for somite formation. Develop. Biol. 293, 116–126 (2006).
Acknowledgements
S. V. and J. R. are supported by the National Institutes of Health (NIH) grant 2R01 OD010936. I.S. gratefully acknowledges support from the Division of Physics at the National Science Foundation (NSF) through Grant No. PHY-2102906. Part of this research was performed while I.S. was visiting the Institute for Mathematical and Statistical Innovation (IMSI), which is supported by NSF Grant No. DMS-1929348. This research was also supported in part by NSF Grant No. PHY-1748958, Gordon and Betty Moore Foundation Grant No. 2919.02, and the Kavli Foundation. S. V. was also supported by the Ministry of Science and Higher Education of the Russian Federation (Agreement No. 075-15-2022-291 on the provision of a grant in the form of subsidies from the federal budget for state support of the establishment and development of the world-class scientific center "Pavlov center Integrative physiology for medicine, high-tech healthcare, and stress-resilience technologies"). We are grateful to Prof. O. Radulescu for his interesting and useful comments.
Author information
Contributions
J.R., S.V., I.S., and D.G. proposed the concept. S.V. wrote the text. I.S. and D.G. edited the paper. I.S. administered the project.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher's note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Cite this article
Reinitz, J., Vakulenko, S., Sudakow, I. et al. Robust morphogenesis by chaotic dynamics. Sci Rep 13, 7482 (2023). https://doi.org/10.1038/s41598-023-34041-x