
  • Perspective

Generative learning for nonlinear dynamics

Abstract

Modern generative machine learning models can create realistic outputs far beyond their training data, such as photorealistic artwork, accurate protein structures or conversational text. These successes suggest that generative models learn to effectively parametrize and sample arbitrarily complex distributions. Beginning half a century ago, foundational works in nonlinear dynamics used tools from information theory for a similar purpose, namely, to infer properties of chaotic attractors from real-world time series. This Perspective article connects these classical works to emerging themes in large-scale generative statistical learning. It focuses on two classical problems: reconstructing dynamical manifolds given partial measurements, which parallels modern latent variable methods, and inferring minimal dynamical motifs underlying complicated data sets, which mirrors interpretability probes for trained models.
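The first of these classical problems, reconstructing a dynamical manifold from partial measurements, has a canonical solution in delay-coordinate embedding (Takens' theorem): time-shifted copies of a single measured coordinate are stacked into state vectors that, for suitable choices of delay and dimension, recover a manifold diffeomorphic to the original attractor. Below is a minimal sketch of this idea, assuming the Lorenz system as illustrative dynamics and hand-picked delay and embedding-dimension values (not the article's own method or any principled parameter choice):

```python
# Minimal sketch of delay-coordinate (Takens-style) attractor reconstruction
# from a single measured coordinate. The Lorenz system, the delay tau and the
# embedding dimension here are illustrative assumptions chosen by hand.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Simulate, then keep only a partial measurement: the x coordinate.
t = np.linspace(0, 100, 20000)
sol = solve_ivp(lorenz, (0, 100), [1.0, 1.0, 1.0], t_eval=t)
x = sol.y[0]

def delay_embed(series, dim=3, tau=50):
    """Stack time-shifted copies of a scalar series into state vectors."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

# Each row is a reconstructed state; for suitable dim and tau the embedded
# trajectory is diffeomorphic to the original Lorenz attractor.
embedded = delay_embed(x, dim=3, tau=50)
print(embedded.shape)  # (19900, 3)
```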


Fig. 1: Chaos as a generative process.
Fig. 2: Latent dynamics revisit classical attractor reconstruction.
Fig. 3: State-space models generate complex dynamics.
Fig. 4: Latent discretization and interpretability.



Acknowledgements

The author thanks H. Abarbanel and H. Swinney for informative discussions. This project has been made possible in part by grant number DAF2023-329596 from the Chan Zuckerberg Initiative DAF, an advised fund of Silicon Valley Community Foundation.

Author information

Corresponding author

Correspondence to William Gilpin.

Ethics declarations

Competing interests

The author declares no competing interests.

Peer review

Peer review information

Nature Reviews Physics thanks the anonymous reviewers for their contribution to the peer review of this work.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Gilpin, W. Generative learning for nonlinear dynamics. Nat Rev Phys 6, 194–206 (2024). https://doi.org/10.1038/s42254-024-00688-2

