Abstract
Unlike standard neural networks, neural operators map input functions to output functions, possibly in different spaces. Neural operators therefore allow the solution of parametric ordinary differential equations (ODEs) and partial differential equations (PDEs) for a distribution of boundary or initial conditions and excitations, and can also be used for system identification as well as for designing various components of digital twins. We introduce the Laplace neural operator (LNO), which incorporates the pole–residue relationship between the input and output spaces, leading to better interpretability and generalization for certain classes of problems. The LNO can process non-periodic signals and transient responses arising from both zero and non-zero initial conditions, which allows it to achieve better approximation accuracy than other neural operators in extrapolation settings when solving several ODEs and PDEs. We also highlight the LNO’s good interpolation ability, from a low-resolution input to high-resolution outputs at arbitrary locations within the domain. To demonstrate the scalability of the LNO, we conduct large-scale simulations of Rossby waves around the globe, employing millions of degrees of freedom. Taken together, our findings show that a pretrained LNO model offers an effective real-time solution for general ODEs and PDEs at scale and is an efficient alternative to existing neural operators.
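To make the pole–residue idea concrete, the sketch below shows a minimal spectral layer whose transfer function is parameterized by learned poles and residues and applied to the input spectrum. It is an illustrative, assumption-laden sketch, not the authors' released implementation (see ref. 39): the class name PoleResidueLayer1d, the single-channel setup, the initialization and the restriction to the steady-state part of the response are our simplifications.

```python
# Minimal, illustrative sketch of a pole–residue spectral layer (steady-state
# part only). NOT the authors' implementation; names, shapes and initialization
# are assumptions made for this example.
import torch
import torch.nn as nn


class PoleResidueLayer1d(nn.Module):
    """1D, single-channel layer with a learned pole–residue transfer function."""

    def __init__(self, n_poles: int):
        super().__init__()
        # Learned poles mu_k = alpha_k + i*beta_k and residues r_k (complex scalars).
        self.pole_re = nn.Parameter(torch.randn(n_poles) * 0.1 - 1.0)  # keep Re(mu) < 0 initially
        self.pole_im = nn.Parameter(torch.randn(n_poles))
        self.res_re = nn.Parameter(torch.randn(n_poles) * 0.1)
        self.res_im = nn.Parameter(torch.randn(n_poles) * 0.1)

    def forward(self, x: torch.Tensor, dt: float) -> torch.Tensor:
        # x: (batch, n_points) samples of the input signal on a uniform grid.
        n = x.shape[-1]
        x_hat = torch.fft.rfft(x, dim=-1)                     # input spectrum F(i*omega)
        omega = 2.0 * torch.pi * torch.fft.rfftfreq(n, d=dt)  # angular frequencies
        s = 1j * omega                                        # evaluate H(s) on the imaginary axis
        poles = torch.complex(self.pole_re, self.pole_im)
        residues = torch.complex(self.res_re, self.res_im)
        # Transfer function H(s) = sum_k r_k / (s - mu_k), broadcast over frequencies.
        h = (residues[:, None] / (s[None, :] - poles[:, None])).sum(dim=0)
        y_hat = h[None, :] * x_hat                            # steady-state response spectrum
        return torch.fft.irfft(y_hat, n=n, dim=-1)


# Toy usage: map a batch of sampled input functions to output functions of the same length.
layer = PoleResidueLayer1d(n_poles=8)
u = torch.randn(4, 256)   # four input signals sampled at 256 points
v = layer(u, dt=0.01)     # predicted responses, shape (4, 256)
print(v.shape)
```

In the full LNO, such layers also account for the transient (initial-condition-dependent) terms of the response and are composed with pointwise lifting and projection networks; see the released code (ref. 39) for the actual architecture.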
Data availability
The dataset generation scripts used for the problems studied in this work are available in a publicly available GitHub repository (ref. 39).
Code availability
The code used in this study is released in a publicly available GitHub repository (ref. 39).
References
Lu, L., Jin, P., Pang, G., Zhang, Z. & Karniadakis, G. E. Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators. Nature Mach. Intell. 3, 218–229 (2021).
Li, Z. et al. Fourier neural operator for parametric partial differential equations. In Proc. 2021 International Conference on Learning Representation (ICLR, 2021).
Lu, L. et al. A comprehensive and fair comparison of two neural operators (with practical extensions) based on fair data. Comput. Methods Appl. Mech. Eng. 393, 114778 (2022).
Kontolati, K., Goswami, S., Shields, M. D. & Karniadakis, G. E. On the influence of over-parameterization in manifold based surrogates and deep neural operators. J. Comput. Phys. 479, 112008 (2023).
Cao, Q., Goswami, S., Tripura, T., Chakraborty, S. & Karniadakis, G. E. Deep neural operators can predict the real-time response of floating offshore structures under irregular waves. Comput. Struct. 291, 107228 (2024).
Oommen, V., Shukla, K., Goswami, S., Dingreville, R. & Karniadakis, G. E. Learning two-phase microstructure evolution using neural operators and autoencoder architectures. npj Comput. Mater. 8, 190 (2022).
Goswami, S. et al. Neural operator learning of heterogeneous mechanobiological insults contributing to aortic aneurysms. J. R. Soc. Interface 19, 20220410 (2022).
Li, Z. et al. Neural operator: graph kernel network for partial differential equations. Preprint at arXiv:2003.03485 (2020).
Kovachki, N. et al. Neural operator: learning maps between function spaces with applications to PDEs. J. Mach. Learn. Res. 24, 1–97 (2023).
Li, Z. et al. Multipole graph neural operator for parametric partial differential equations. Adv. Neural. Inf. Process. Syst. 33, 6755–6766 (2020).
Tripura, T. & Chakraborty, S. Wavelet neural operator for solving parametric partial differential equations in computational mechanics problems. Comput. Methods Appl. Mech. Eng. 404, 115783 (2023).
Bonev, B. et al. Spherical Fourier neural operators: learning stable dynamics on the sphere. In International Conference on Machine Learning (ed. Lawrence, N.) 2806–2823 (PMLR, 2023).
Borrel-Jensen, N., Goswami, S., Engsig-Karup, A. P., Karniadakis, G. E. & Jeong, C. H. Sound propagation in realistic interactive 3D scenes with parameterized sources using deep neural operators. Proc. Natl Acad. Sci. USA 121, e2312159120 (2024).
Maust, H. et al. Fourier continuation for exact derivative computation in physics-informed neural operators. Preprint at arXiv:2211.15960 (2022).
Li, Z. et al. Learning dissipative dynamics in chaotic systems. In Proc. 36th Conference on Neural Information Processing Systems 1220 (Curran Associates, 2022).
Wen, G., Li, Z., Azizzadenesheli, K., Anandkumar, A. & Benson, S. M. U-FNO—an enhanced Fourier neural operator-based deep-learning model for multiphase flow. Adv. Water Res. 163, 104180 (2022).
Jiang, Z. et al. Fourier-MIONet: Fourier-enhanced multiple-input neural operators for multiphase modeling of geological carbon sequestration. Preprint at arXiv:2303.04778 (2023).
Gupta, J. K. & Brandstetter, J. Towards multi-spatiotemporal-scale generalized PDE modeling. TMLR https://openreview.net/forum?id=dPSTDbGtBY (2023).
Raonic, B., Molinaro, R., Rohner, T., Mishra, S. & de Bezenac, E. Convolutional neural operators. In ICLR 2023 Workshop on Physics for Machine Learning (2023).
Bartolucci, F. et al. Are neural operators really neural operators? Frame theory meets operator learning. SAM Research Report (ETH Zurich, 2023).
Deka, S. A. & Dimarogonas, D. V. Supervised learning of Lyapunov functions using Laplace averages of approximate Koopman eigenfunctions. IEEE Control Syst. Lett. 7, 3072–3077 (2023).
Mohr, R. & Mezić, I. Construction of eigenfunctions for scalar-type operators via Laplace averages with connections to the Koopman operator. Preprint at arXiv:1403.6559 (2014).
Brunton, S. L., Budišić, M., Kaiser, E. & Kutz, J. N. Modern Koopman theory for dynamical systems. SIAM Rev. 64, 229–340 (2022).
Bevanda, P., Sosnowski, S. & Hirche, S. Koopman operator dynamical models: learning, analysis and control. Ann. Rev. Control 52, 197–212 (2021).
Lin, Y. K. M. Probabilistic Theory of Structural Dynamics (Krieger Publishing Company, 1967).
Kreyszig, E. Advanced Engineering Mathematics Vol. 334 (John Wiley & Sons, 1972).
Hu, S. L. J., Yang, W. L. & Li, H. J. Signal decomposition and reconstruction using complex exponential models. Mech. Syst. Signal Process. 40, 421–438 (2013).
Hu, S. L. J., Liu, F., Gao, B. & Li, H. Pole-residue method for numerical dynamic analysis. J. Eng. Mech. 142, 04016045 (2016).
Cho, K. et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation. In Proc. 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP) 1724–1734 (Association for Computational Linguistics, 2014).
Ronneberger, O., Fischer, P. & Brox, T. U-net: convolutional networks for biomedical image segmentation. In Medical Image Computing and Computer-assisted Intervention – MICCAI 2015: 18th International Conference Part III (eds Navab, N. et al.) 234–241 (Springer International, 2015).
Paszke, A. et al. PyTorch: an imperative style, high-performance deep learning library. Adv. Neural. Inf. Process. Syst. 32, 1–12 (2019).
Agarwal, A. et al. TensorFlow: a system for large-scale machine learning. In Proc. 12th USENIX Conference on Operating Systems Design and Implementation 265–283 (USENIX Association, 2016).
Shi, Y. Analysis on Averaging Lorenz System and its Application to Climate. Doctoral dissertation, Univ. of Minnesota (2021).
Ahmed, N., Rafiq, M., Rehman, M. A., Iqbal, M. S. & Ali, M. Numerical modeling of three dimensional Brusselator reaction diffusion system. AIP Adv. 9, 015205 (2019).
Xu, Y., Ma, J., Wang, H., Li, Y. & Kurths, J. Effects of combined harmonic and random excitations on a Brusselator model. Eur. Phys. J. B 90, 194 (2017).
Behrens, J. Atmospheric and ocean modeling with an adaptive finite element solver for the shallow-water equations. Appl. Numer. Math. 26, 217–226 (1998).
Kontolati, K., Goswami, S., Karniadakis, G. E. & Shields, M. D. Learning in latent spaces improves the predictive accuracy of deep neural operators. Preprint at arXiv:2304.07599 (2023).
Cao, Q., James Hu, S. L. & Li, H. Laplace- and frequency-domain methods on computing transient responses of oscillators with hysteretic dampings to deterministic loading. J. Eng. Mech. 149, 04023005 (2023).
Cao, Q., Goswami, S. & Karniadakis, G. E. Code and data for Laplace neural operator for solving differential equations. Zenodo https://doi.org/10.5281/zenodo.11002002 (2024).
Acknowledgements
Q.C. acknowledges funding from the Dalian University of Technology for visiting Brown University, USA. Q.C., S.G. and G.E.K. acknowledge support by the DOE SEA-CROGS project (DE-SC0023191), the MURI-AFOSR FA9550-20-1-0358 project and the ONR Vannevar Bush Faculty Fellowship (N00014-22-1-2795). All authors acknowledge the computing support provided by the computational resources and services at the Center for Computation and Visualization (CCV), Brown University, where experiments were conducted.
Author information
Authors and Affiliations
Contributions
Q.C. and G.E.K. were responsible for conceptualization. Q.C., S.G. and G.E.K. were responsible for data curation. Q.C. and S.G. were responsible for formal analysis, investigation, software, validation and visualization. Q.C. was responsible for the methodology. G.E.K. was responsible for funding acquisition, project administration, resources and supervision. All authors wrote the original draft, and reviewed and edited the manuscript.
Corresponding author
Ethics declarations
Competing interests
G.E.K. is one of the founders of Phinyx AI, a private start-up company developing AI software products for engineering. The remaining authors (Q.C. and S.G.) declare no competing interests.
Peer review
Peer review information
Nature Machine Intelligence thanks David Ruhe and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Additional information
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary Information
Additional results as well as details of the network architecture used in this work.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Cao, Q., Goswami, S. & Karniadakis, G.E. Laplace neural operator for solving differential equations. Nat Mach Intell 6, 631–640 (2024). https://doi.org/10.1038/s42256-024-00844-4