Drawing on ergodic theory, we introduce a novel training method for machine-learning-based forecasting of chaotic dynamical systems. The training enforces dynamical invariants of the system of interest, such as the Lyapunov exponent spectrum and the fractal dimension, enabling longer and more stable forecasts when operating with limited data. The technique is demonstrated in detail using reservoir computing, a specific kind of recurrent neural network. Results are given for the Lorenz 1996 chaotic dynamical system and a spectral quasi-geostrophic model of the atmosphere, both typical test cases for numerical weather prediction.
Journal Article: Constraining chaos: Enforcing dynamical invariants in the training of reservoir computers
Chaos: An Interdisciplinary Journal of Nonlinear Science, vol. 33, iss. 10, 2023
Jason A. Platt, Stephen G. Penny, Timothy A. Smith, Tse-Chun Chen, Henry D. I. Abarbanel
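As context for the abstract above, the following is a minimal sketch of the baseline setup it refers to: an echo state network, a common reservoir-computing architecture, trained by ordinary ridge regression to make one-step forecasts of the Lorenz 1996 system. The invariant-enforcing terms that the paper adds to training are not reproduced here, and all sizes and hyperparameters (system dimension, reservoir size, spectral radius, regularization strength) are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def lorenz96(x, F=8.0):
    # Lorenz 1996 tendencies: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt):
    # One fourth-order Runge-Kutta step of the Lorenz 96 dynamics
    k1 = lorenz96(x)
    k2 = lorenz96(x + 0.5 * dt * k1)
    k3 = lorenz96(x + 0.5 * dt * k2)
    k4 = lorenz96(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

rng = np.random.default_rng(0)

# --- generate training data from a small Lorenz 96 system (illustrative size) ---
D, dt = 8, 0.01
x = 8.0 + 0.01 * rng.standard_normal(D)
for _ in range(1000):            # discard the initial transient
    x = rk4_step(x, dt)
T = 3000
U = np.empty((T, D))
for t in range(T):
    U[t] = x
    x = rk4_step(x, dt)

# --- echo state network: fixed random recurrent and input weights ---
N = 300
A = rng.standard_normal((N, N))
A *= 0.9 / np.abs(np.linalg.eigvals(A)).max()   # rescale to spectral radius 0.9
Win = 0.1 * rng.standard_normal((N, D))

# drive the reservoir with the data and record its states
R = np.zeros((T, N))
r = np.zeros(N)
for t in range(T):
    r = np.tanh(A @ r + Win @ U[t])
    R[t] = r

# --- ridge-regression readout: predict u_{t+1} from r_t, after a washout ---
wash, beta = 100, 1e-6
Rtr, Ytr = R[wash:-1], U[wash + 1:]
Wout = np.linalg.solve(Rtr.T @ Rtr + beta * np.eye(N), Rtr.T @ Ytr).T

# one-step training error of the fitted readout
pred = Rtr @ Wout.T
rmse = np.sqrt(np.mean((pred - Ytr) ** 2))
```

Only the readout `Wout` is trained; the paper's contribution is to constrain this training so that the autonomous reservoir reproduces invariants such as the Lyapunov spectrum, which a plain ridge fit like this one does not guarantee.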