
Multiscale simulations of complex systems by learning their effective dynamics

A preprint version of the article is available at arXiv.

Abstract

Predictive simulations of complex systems are essential for applications ranging from weather forecasting to drug design. The veracity of these predictions hinges on their capacity to capture effective system dynamics. Massively parallel simulations predict the system dynamics by resolving all spatiotemporal scales, often at a cost that prevents experimentation, while their findings may not allow for generalization. On the other hand, reduced-order models are fast but limited by the frequently adopted linearization of the system dynamics and the utilization of heuristic closures. Here we present a novel systematic framework that bridges large-scale simulations and reduced-order models to learn the effective dynamics of diverse, complex systems. The framework forms algorithmic alloys between nonlinear machine learning algorithms and the equation-free approach for modelling complex systems. Learning the effective dynamics deploys autoencoders to formulate a mapping between fine- and coarse-grained representations and evolves the latent space dynamics using recurrent neural networks. The algorithm is validated on benchmark problems, and we find that it outperforms state-of-the-art reduced-order models in terms of predictability, and large-scale simulations in terms of cost. Learning the effective dynamics is applicable to systems ranging from chemistry to fluid mechanics and reduces the computational effort by up to two orders of magnitude while maintaining the prediction accuracy of the full system dynamics. We argue that learning the effective dynamics provides a potent novel modality for accurately predicting complex systems.
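
The LED architecture outlined in the abstract couples a dimensionality-reducing autoencoder with a recurrent network that advances the coarse-grained (latent) state in time. The following is a minimal conceptual sketch of that coupling in PyTorch; it is not the authors' released implementation (available at https://github.com/cselab/LED), and the fully connected layers, the plain LSTM propagator, and all dimensions and synthetic data are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LatentEffectiveDynamics(nn.Module):
    """Toy autoencoder + LSTM pairing in the spirit of LED (illustrative only)."""

    def __init__(self, state_dim=1024, latent_dim=8, hidden_dim=64):
        super().__init__()
        # Encoder/decoder map between fine-grained states and a low-dimensional latent space.
        self.encoder = nn.Sequential(nn.Linear(state_dim, 256), nn.CELU(),
                                     nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.CELU(),
                                     nn.Linear(256, state_dim))
        # Recurrent network that propagates the latent (coarse-grained) dynamics.
        self.rnn = nn.LSTM(latent_dim, hidden_dim, batch_first=True)
        self.readout = nn.Linear(hidden_dim, latent_dim)

    def forward(self, u_seq):
        # u_seq: (batch, time, state_dim) trajectory of fine-grained snapshots.
        z = self.encoder(u_seq)          # latent trajectory
        h, _ = self.rnn(z[:, :-1])       # history of latent states ...
        z_next = self.readout(h)         # ... predicts the next latent state
        u_rec = self.decoder(z)          # reconstruction of the input
        u_pred = self.decoder(z_next)    # one-step-ahead prediction in state space
        return u_rec, u_pred, z


model = LatentEffectiveDynamics()
u_seq = torch.randn(4, 32, 1024)         # synthetic data, for illustration only
u_rec, u_pred, _ = model(u_seq)
loss = nn.functional.mse_loss(u_rec, u_seq) + nn.functional.mse_loss(u_pred, u_seq[:, 1:])
loss.backward()
```

In practice, one would train on trajectories of fine-grained snapshots by combining a reconstruction loss for the autoencoder with a forecasting loss for the latent propagator, and then iterate the latent dynamics for long-horizon predictions, decoding back to the fine-grained space only when needed.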


Fig. 1: Multiscale-LED.
Fig. 2: FitzHugh–Nagumo model.
Fig. 3: Kuramoto–Sivashinsky equation.
Fig. 4: Iterative latent forecasting in flow past a cylinder.
Fig. 5: Multiscale LED in flow past a cylinder.


Data availability

All the data analysed in this paper were produced with open-source software described in the code availability statement. Reference data, the scripts used to produce the data figures, and instructions for launching the LED training and inference (evaluation of trained models) scripts are available in the GitHub repository: https://github.com/cselab/LED.

Code availability

Simulations of the Kuramoto–Sivashinsky (KS) equation were performed with a spectral fourth-order solver for stiff PDEs developed in Python. Simulations of the FitzHugh–Nagumo (FHN) equation were performed with a lattice Boltzmann (LB) method, also developed in Python. The LED software is implemented in Python and uses the PyTorch library for the neural networks. All codes are readily available in the GitHub repository: https://github.com/cselab/LED. Direct numerical simulations of the flow past a cylinder were performed with the flow solver CubismUP 2D: https://github.com/cselab/CubismUP_2D.
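
A spectral fourth-order time stepper for a stiff PDE such as the KS equation can be realized, for example, with the exponential time differencing Runge–Kutta (ETDRK4) scheme of Kassam and Trefethen; the self-contained NumPy sketch below illustrates this approach and is not the solver shipped with LED. The domain size, resolution, time step, initial condition and number of steps are illustrative assumptions.

```python
import numpy as np

# Illustrative ETDRK4 spectral solver for the Kuramoto-Sivashinsky equation
#     u_t = -u u_x - u_xx - u_xxxx,   periodic on [0, L),
# following the contour-integral construction of Kassam & Trefethen (2005).
# All parameter values below are assumptions chosen for demonstration.
N, L, dt = 128, 22.0, 0.25
x = L * np.arange(N) / N
u = np.cos(2 * np.pi * x / L) * (1 + np.sin(2 * np.pi * x / L))   # initial condition
v = np.fft.fft(u)

k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)       # wavenumbers
lin = k**2 - k**4                                # linear operator in Fourier space
E, E2 = np.exp(dt * lin), np.exp(dt * lin / 2)

# ETDRK4 coefficients evaluated by a contour integral to avoid cancellation errors.
M = 16
r = np.exp(1j * np.pi * (np.arange(1, M + 1) - 0.5) / M)
LR = dt * lin[:, None] + r[None, :]
Q  = dt * np.real(np.mean((np.exp(LR / 2) - 1) / LR, axis=1))
f1 = dt * np.real(np.mean((-4 - LR + np.exp(LR) * (4 - 3 * LR + LR**2)) / LR**3, axis=1))
f2 = dt * np.real(np.mean((2 + LR + np.exp(LR) * (-2 + LR)) / LR**3, axis=1))
f3 = dt * np.real(np.mean((-4 - 3 * LR - LR**2 + np.exp(LR) * (4 - LR)) / LR**3, axis=1))

g = -0.5j * k                                    # nonlinear term: -0.5 * d/dx (u^2)
g[N // 2] = 0.0                                  # zero the Nyquist mode of the odd derivative

def nonlin(v):
    """Nonlinear term of the KS equation evaluated pseudo-spectrally."""
    return g * np.fft.fft(np.real(np.fft.ifft(v))**2)

for _ in range(1000):                            # advance 1,000 steps of size dt
    Nv = nonlin(v)
    a = E2 * v + Q * Nv
    Na = nonlin(a)
    b = E2 * v + Q * Na
    Nb = nonlin(b)
    c = E2 * a + Q * (2 * Nb - Nv)
    Nc = nonlin(c)
    v = E * v + Nv * f1 + 2 * (Na + Nb) * f2 + Nc * f3

u = np.real(np.fft.ifft(v))                      # final field in physical space
```

Because the stiff linear operator is handled exactly and the ETDRK4 coefficients are precomputed, each time step costs only a handful of FFTs, which makes such solvers convenient for generating the training trajectories used by data-driven models.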


Acknowledgements

We thank N. Kallikounis (ETH Zurich) for helpful discussions on the LB method, P. Weber and M. Chatzimanolakis (ETH Zurich) for help with the simulations of the flow past a cylinder, and Y. Kevrekidis (Johns Hopkins University) and K. Spiliotis (University of Rostock) for providing code to reproduce data for the FHN equation. We acknowledge the infrastructure and support of the Swiss National Supercomputing Centre (CSCS), which provided the necessary computational resources under project s929.

Author information


Contributions

P.K. conceived the project; P.R.V., G.A., C.U. and P.K. designed and performed research; P.R.V. and G.A. contributed new analytic tools; P.R.V., G.A. and P.K. analysed data; and P.R.V., G.A. and P.K. wrote the paper.

Corresponding author

Correspondence to Petros Koumoutsakos.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Machine Intelligence thanks Harsh Bhatia and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information


Supplementary Methods E and F, Comparison Measures A–D, Results A–E, Figs. 1–14 and Tables 1–17.


About this article


Cite this article

Vlachas, P.R., Arampatzis, G., Uhler, C. et al. Multiscale simulations of complex systems by learning their effective dynamics. Nat Mach Intell 4, 359–366 (2022). https://doi.org/10.1038/s42256-022-00464-w


