Scientific and physics-aware machine learning, and data assimilation
  • Luca Magri
    • Group
    • Collaborations
  • Publications
  • Research
    • Overview
    • Scientific machine learning >
      • Physics-aware machine learning
      • Chaotic time series forecasting
      • Nonlinear model reduction
      • Super-resolution and reconstruction
    • Real-time digital twins and data assimilation >
      • Inferring unknown unknowns: Bias-aware data assimilation
    • Optimization >
      • Bayesian optimisation
      • Chaotic systems
    • Mathematical modelling of multi-physics fluids >
      • Reacting flows and sound
    • Quantum computing and machine learning >
      • Solving nonlinear equations with quantum algorithms
      • Linear methods from quantum mechanics
    • Data and codes
  • Jobs/grants
  • Outreach
    • Research Centre in Data-Driven Engineering
    • Data-driven methods, machine learning and optimization
    • Data-driven Dynamical Systems Analysis
  • Consultancy
  • Teaching
    • University modules
    • Artificial intelligence for engineering
    • Mathematical methods
    • Misc
  • Contact

Chaotic time series forecasting

What is chaotic time series forecasting?
Chaotic dynamics, which arise in many fields from fluid dynamics to climate modelling, exhibit complex and unpredictable behaviours. This is due to the sensitivity of these systems to small changes, which makes accurate prediction in time difficult. Chaotic time series forecasting addresses this challenge by combining data-driven tools with mathematical modelling.
Chaotic time series forecasting - combining dynamical systems and machine learning
A dynamical system is a mathematical model of a time-dependent process, whose evolution depends on the dynamics of the system and on the initial conditions. When a system is chaotic, it displays an exponential sensitivity to initial conditions and perturbations. This means that a small error in the prediction grows exponentially in time, which makes the forecasting of chaotic systems highly complex. The idea that small changes have large, nonlinear effects is commonly referred to as the butterfly effect.

Precisely, the dynamics are given by

dx(t)/dt = F(x(t)),

where x(t) is the state of the physical system at time t and F(·) is the dynamics. To address the complexity of the prediction, machine learning methods have been integrated into the forecasting process. Recurrent neural networks (RNNs) are a class of artificial neural networks designed to process sequential data, which makes them well suited for time series modelling and forecasting. They can capture complex temporal dependencies and patterns. By training the network on the available data, RNNs learn the underlying dynamics of the system, which enables them to predict future states. By combining an understanding of chaotic systems with machine learning, we can gain insight into complex dynamics from experiments and observations, and predict future states when traditional methods fail.
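As an illustration of this sensitivity, the following sketch integrates two copies of the Lorenz-63 system whose initial conditions differ by 10^-8 and tracks how the separation between the trajectories grows. This is a minimal example with a standard fourth-order Runge-Kutta integrator, not the code used in our work.

```python
import numpy as np

# Lorenz-63 system: a classical chaotic benchmark.
def lorenz(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

# One step of the classical fourth-order Runge-Kutta integrator.
def rk4_step(f, x, dt):
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt = 0.01
x_a = np.array([1.0, 1.0, 1.0])
x_b = x_a + np.array([1e-8, 0.0, 0.0])   # perturbed copy of the initial condition
errors = []
for _ in range(2500):                    # integrate up to t = 25
    x_a = rk4_step(lorenz, x_a, dt)
    x_b = rk4_step(lorenz, x_b, dt)
    errors.append(np.linalg.norm(x_a - x_b))
# The separation grows by many orders of magnitude before saturating
# at the size of the attractor: the butterfly effect.
```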

Figure: The evolution of the Lorenz-63 system predicted by a long short-term memory (LSTM) network.
An example of a practical application: Prediction of extreme events
One application of chaotic time series forecasting is the prediction of extreme events.
Extreme events are sudden large-amplitude changes in the state or observables, which typically have negative consequences for the system of interest. They appear in many nonlinear phenomena, such as rogue waves, weather patterns and power-grid shocks. Here, we focus on extreme events in chaotic (turbulent) fluid mechanics, which characterise many systems of engineering interest.
Figure: Extreme event in a chaotic flow. The flow starts from a condition characterised by low-amplitude oscillations, shows violent high-amplitude changes in the velocity (arrows) and vorticity (surfaces), and then reverts to the low-amplitude regime.

To predict the extreme events, we use echo state networks (ESNs). ESNs are recurrent neural networks that are trained on past time series data to predict the future of the system given current measurements. In doing so, they can tell us at the current time whether an event will happen in the near future. Based on this prediction, we can act on the system in advance to prevent the event, mitigating its negative impact.
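A minimal echo state network can be sketched as follows: only the linear readout is trained (by ridge regression), while the reservoir weights are fixed and random. This toy example forecasts a sine wave in closed loop, i.e. by feeding its own predictions back as inputs, as is done when forecasting a chaotic system autonomously. All sizes and scalings here are illustrative choices, not the hyperparameters used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                    # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, N)           # fixed random input weights (scalar input)
W = rng.uniform(-0.5, 0.5, (N, N))         # fixed random reservoir weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale to spectral radius 0.9

def reservoir_states(u_seq, r=None):
    """Drive the reservoir with an input sequence; return all states."""
    r = np.zeros(N) if r is None else r
    states = np.empty((len(u_seq), N))
    for i, u in enumerate(u_seq):
        r = np.tanh(W @ r + W_in * u)
        states[i] = r
    return states

# Train on a sine wave with one-step-ahead targets.
t = np.arange(0.0, 60.0, 0.1)
u = np.sin(t)
S = reservoir_states(u[:-1])
washout = 100                              # discard the initial reservoir transient
X, Y = S[washout:], u[1:][washout:]
# Ridge-regression readout: W_out = (X^T X + lambda I)^{-1} X^T Y
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ Y)

# Closed loop: feed the predictions back as inputs to forecast autonomously.
r = S[-1]
x = r @ W_out                              # prediction of the last training point
preds = []
for _ in range(50):
    r = np.tanh(W @ r + W_in * x)
    x = r @ W_out
    preds.append(x)
# preds should track the continuation sin(60.0 + 0.1 k) of the signal.
```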
Figure: Predicted extreme event. The ESN continuously monitors the system, and correctly predicts the incoming event 6.5 Lyapunov times in advance.
Figure: Time series of the controlled system. By using the ESN predictions, we act in advance and avoid the extreme events.

In addition to short-term predictions, we also study extreme events in a statistical sense. By analysing the statistics, we can compute the probability of an event happening at any given time. The difficulty is that extreme events are typically rare, which means that we usually do not have enough data to estimate their probability accurately.
In this scenario, we make long-term predictions with ESNs to extrapolate the statistics and improve the statistical knowledge of the system beyond the available training data.
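The effect of record length on tail statistics can be illustrated with a simple synthetic example (a Gaussian stand-in, not the chaotic flow): the exceedance probability of a rare threshold estimated from a short record is unreliable, whereas a long record, such as one generated by a network running autonomously, pins it down.

```python
import numpy as np

# Hypothetical stand-in signal (Gaussian noise, NOT the chaotic flow):
# estimate the probability of exceeding a rare threshold from a short
# "training" record and from a long "model-generated" record.
rng = np.random.default_rng(1)
train = rng.standard_normal(2_000)        # short record: few rare events
long_run = rng.standard_normal(200_000)   # long record: many rare events
threshold = 3.0                           # ~0.13% exceedance per sample

p_train = np.mean(train > threshold)      # noisy: only a handful of events
p_long = np.mean(long_run > threshold)    # close to the true probability
```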
Figure: PDF of the kinetic energy, characterised by a long tail of large-amplitude events. The ESN improves the prediction of the extreme statistics with respect to the available data (Train).

Material
This work was part of the PhD of Alberto Racca. It is published in
Data-driven prediction and control of extreme events in a chaotic flow, A. Racca and L. Magri, Physical Review Fluids (2022).
The code used for this project is publicly available on GitHub.
Research funded by EPSRC, Cambridge Trust and ERC grants.
An example of a practical application: Stability analysis from data
Figure: The first 35 Lyapunov exponents of the Kuramoto–Sivashinsky system for the reference data (black squares), PI-LSTM (red dots) and LH-LSTM (blue crosses).
Figure: The angle distribution of the Kuramoto–Sivashinsky system for the leading covariant Lyapunov vectors of three different subspaces: (a) unstable-neutral, (b) unstable-stable, and (c) neutral-stable. The black line corresponds to the reference data; the red and blue lines indicate the results obtained from the 10,000-LT-long autonomous evolution of the PI-LSTM and LH-LSTM models, respectively.
Another application of chaotic time series forecasting is the stability analysis of chaotic systems from data.

The predictability and stability of a chaotic system are characterized by its tangent space, which can be computed using the linearized dynamics provided by the Jacobian. This computation allows for the derivation of quantities such as the Lyapunov exponents (LEs), which measure the exponential rate of separation of trajectories. A geometric characterization is provided by the covariant Lyapunov vectors (CLVs), which constitute a covariant basis of the tangent space, and point to directions of asymptotic expansion and contraction of the dynamical system. 
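For the Lorenz-63 system, the Lyapunov exponents can be computed from the Jacobian with the standard QR (Benettin) algorithm, as in the following sketch. This is illustrative; the classical parameters sigma = 10, rho = 28, beta = 8/3 are assumed.

```python
import numpy as np

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def f(x):
    """Lorenz-63 dynamics."""
    return np.array([SIGMA * (x[1] - x[0]),
                     x[0] * (RHO - x[2]) - x[1],
                     x[0] * x[1] - BETA * x[2]])

def jac(x):
    """Jacobian of the Lorenz-63 dynamics, for the linearized (tangent) equations."""
    return np.array([[-SIGMA, SIGMA, 0.0],
                     [RHO - x[2], -1.0, -x[0]],
                     [x[1], x[0], -BETA]])

def rk4(g, y, dt):
    k1 = g(y); k2 = g(y + 0.5 * dt * k1)
    k3 = g(y + 0.5 * dt * k2); k4 = g(y + dt * k3)
    return y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, x = 0.01, np.array([1.0, 1.0, 1.0])
for _ in range(2000):                      # discard the transient off the attractor
    x = rk4(f, x, dt)

Q, le_sum, steps = np.eye(3), np.zeros(3), 30000
for _ in range(steps):
    x = rk4(f, x, dt)
    J = jac(x)
    Q = rk4(lambda M: J @ M, Q, dt)        # evolve tangent vectors with the Jacobian
    Q, R = np.linalg.qr(Q)                 # re-orthonormalize; log|diag(R)| accumulates
    le_sum += np.log(np.abs(np.diag(R)))   # the local expansion/contraction rates

les = le_sum / (steps * dt)                # Lyapunov exponents, sorted by magnitude
# For Lorenz-63 the spectrum is approximately (0.9, 0.0, -14.6), and the sum
# equals the average trace of the Jacobian, -(sigma + 1 + beta) ~= -13.67.
```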
Preserving these stability properties is crucial when building surrogate models from limited observations.

To ensure that ML surrogate models are physically consistent with the underlying dynamics, we can assess the models' stability properties. When the stability properties of the original system are reproduced, this evaluation has profound implications for the explainability and interpretability of neural networks.

We have demonstrated that the LSTM reproduces the stability properties of multiple chaotic systems, such as the Kuramoto–Sivashinsky equation, even when trained on partial data.

Material, activities, and people
This work was part of
Reconstruction, forecasting, and stability of chaotic dynamics from partial data, E. Özalp, G. Margazoglou and L. Magri, Chaos (2023).
This research has received financial support from the ERC Starting Grant No. PhyCo 949388.


Article written by Elise Özalp, Alberto Racca and Luca Magri.
© 2024 Luca Magri