Deep learning for universal linear embeddings of nonlinear dynamics

Identifying coordinate transformations that make strongly nonlinear dynamics approximately linear has the potential to enable nonlinear prediction, estimation, and control using linear theory. The Koopman operator is a leading data-driven embedding, and its eigenfunctions provide intrinsic coordinates that globally linearize the dynamics. However, identifying and representing these eigenfunctions has proven challenging. This work leverages deep learning to discover representations of Koopman eigenfunctions from data. Our network is parsimonious and interpretable by construction, embedding the dynamics on a low-dimensional manifold. We identify nonlinear coordinates on which the dynamics are globally linear using a modified auto-encoder. We also generalize Koopman representations to include a ubiquitous class of systems with continuous spectra. Our framework parametrizes the continuous frequency using an auxiliary network, enabling a compact and efficient embedding, while connecting our models to decades of asymptotics. Thus, we benefit from the power of deep learning, while retaining the physical interpretability of Koopman embeddings.


Discrete spectrum example
Supplementary Figure 1 shows the average prediction error versus the number of prediction steps. Even for a large number of steps the error remains small, indicating good long-term prediction. This figure also demonstrates prediction performance on example trajectories. The eigenfunctions for this example are shown in Supplementary Figure 2; one is quadratic and the other is linear. This is expected, since one can derive analytically that y_1 = x_1 and y_2 = x_2 − b x_1^2 form a pair of eigenfunctions for this system, where b = −λ/(2µ − λ). When the eigenvalues are allowed to vary with the auxiliary network used for continuous-spectrum systems, they remain nearly constant at the true values of −0.05 and −1, as shown in Supplementary Figure 3.
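The analytic eigenfunctions above can be verified numerically. A minimal sketch, assuming the discrete-spectrum example system from the main paper, dx_1/dt = µx_1, dx_2/dt = λ(x_2 − x_1^2), with µ = −0.05 and λ = −1; the integrator and variable names are ours, not the paper's code:

```python
import numpy as np

mu, lam = -0.05, -1.0
b = -lam / (2 * mu - lam)  # b = -lambda / (2*mu - lambda), approx. 1.111 here

def f(x):
    """Right-hand side of the example system dx1/dt = mu*x1, dx2/dt = lam*(x2 - x1^2)."""
    x1, x2 = x
    return np.array([mu * x1, lam * (x2 - x1**2)])

def rk4_step(x, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# initial condition and its embedded (eigenfunction) coordinates
x = np.array([0.5, 0.8])
y1_0, y2_0 = x[0], x[1] - b * x[0]**2

dt, T = 0.01, 5.0
for _ in range(int(T / dt)):
    x = rk4_step(x, dt)

# in eigenfunction coordinates the dynamics are exactly linear:
# y1(T) = y1(0) * exp(mu*T) and y2(T) = y2(0) * exp(lam*T)
y1_T, y2_T = x[0], x[1] - b * x[0]**2
```

The check confirms that y_1 and y_2 each evolve under a single exponential, i.e. they are Koopman eigenfunctions with eigenvalues µ and λ.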

Nonlinear pendulum
The nonlinear pendulum is one of the simplest examples that exhibits a continuous eigenvalue spectrum. Using the auxiliary network, we allow the frequency ω of the Koopman eigenvalues to vary continuously with the embedded coordinates y_1 and y_2, as shown in Supplementary Figure 4. The frequency ω varies smoothly with the radius √(y_1^2 + y_2^2), from around −0.95 to −0.4 as the energy is increased. When the damping rate is also allowed to vary continuously, it remains nearly constant around the value of µ = 0, since the system is conservative.
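This amplitude-dependent frequency is a classical property of the pendulum and can be reproduced in closed form, which illustrates why no single fixed eigenvalue suffices. A NumPy-only sketch, assuming the standard undamped pendulum with g/L = 1, whose exact period is 4K(m) with m = sin²(θ₀/2); the helper names are ours, and K is evaluated via the arithmetic–geometric mean:

```python
import numpy as np

def ellipk_agm(m):
    """Complete elliptic integral of the first kind K(m) via the AGM identity
    K(m) = pi / (2 * AGM(1, sqrt(1 - m)))."""
    a, b = 1.0, np.sqrt(1.0 - m)
    for _ in range(40):  # AGM converges quadratically; 40 iterations is ample
        a, b = 0.5 * (a + b), np.sqrt(a * b)
    return np.pi / (2.0 * a)

def pendulum_frequency(theta0):
    """Angular frequency of the undamped pendulum at amplitude theta0 (radians)."""
    m = np.sin(theta0 / 2.0)**2
    period = 4.0 * ellipk_agm(m)  # exact nonlinear period
    return 2.0 * np.pi / period

# frequency magnitude falls continuously from ~1 toward 0 as amplitude (energy) grows
amplitudes = np.array([0.5, 1.5, 2.5, 3.0])
freqs = pendulum_frequency(amplitudes)
```

The monotone drop in |ω| with energy mirrors the smooth variation learned by the auxiliary network (the sign of ω in the paper's figures is a convention of the learned rotation).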

Fluid flow on attractor
For the final example, we consider the nonlinear fluid flow past a cylinder, which exhibits vortex shedding. We begin by considering dynamics on the attracting manifold. When we train the network with trajectories on the slow manifold, we are able to identify a single complex-conjugate pair of eigenfunctions, shown in Supplementary Figure 6.

Fluid flow off attractor
We now consider the case where we train a network using trajectories that start off of the attracting slow manifold. Supplementary Figure 7 shows the average prediction error versus the number of steps. Although the loss function only penalized 30 prediction steps into the future (S_p = 30), the error remains small for all 100 steps. The figure also shows the embedding of a trajectory in y coordinates. Although the training data now includes trajectories off the attractor, the learned embedding is similar to that of the previous case (see Figure 5 in the main paper). The eigenfunctions are shown in Supplementary Figure 8; their mode shapes match those learned from the on-attractor data in Supplementary Figure 6.
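The multi-step predictions above come from repeatedly applying a linear update in the embedded coordinates. A minimal sketch of such an update for a complex eigenvalue pair µ ± iω, written as an exponential-scaled rotation of (y_1, y_2); this is our own illustrative parametrization consistent with the paper's continuously varying eigenvalues, not the trained network:

```python
import numpy as np

def koopman_step(y, mu, omega, dt):
    """Advance the embedded state (y1, y2) one step under eigenvalues mu +/- i*omega:
    a rotation by omega*dt scaled by exp(mu*dt)."""
    scale = np.exp(mu * dt)
    c, s = np.cos(omega * dt), np.sin(omega * dt)
    K = scale * np.array([[c, -s],
                          [s,  c]])
    return K @ y

# with mu < 0 the radius decays and with mu > 0 it grows, while omega sets the
# (here nearly constant) rotation rate -- matching the growth/decay on either
# side of the limit cycle described for the fluid flow example
y = np.array([1.0, 0.0])
for _ in range(100):
    y = koopman_step(y, mu=-0.1, omega=-1.0, dt=0.1)
```

Because each step is linear, a 100-step forecast is just 100 matrix multiplications, which is why prediction error can stay small well beyond the S_p = 30 steps penalized in training when the learned eigenvalues are accurate.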
The continuously varying eigenvalues are shown in Supplementary Figure 9. Again, similar to the on-attractor case, the damping rate µ varies considerably with radius, while the frequency ω is very nearly a constant −1.

Supplementary Figure 5: Continuous eigenvalues as a function of y_1 and y_2. Note that the frequency ω ≈ −1. The parameter µ shows growth inside the limit cycle (marked in red) and decay outside the limit cycle.

Supplementary Figure 9: Parameter variations for the complex eigenvalues in terms of y_1 and y_2. Note that this is a natural extension of Supplementary Figure 5, which is limited to data on the bowl.