Day 4

Detailed paper information

Paper title: Predicting Short and Long-term Sea Level Changes Using Deep Learning
Authors:
  1. Mads Ehrhorn, DTU Space (Speaker)
  2. Ole Baltazar Andersen, DTU Space
  3. Carsten Bjerre Ludwigsen, Technical University of Denmark
Form of presentation: Poster
Topics:
  • C1. AI and Data Analytics
    • C1.04 AI4EO applications for Land and Water
Abstract text: Predicting short- and long-term sea-level changes is a critical task with deep implications for both the safety and job security of a large part of the world's population.

The satellite altimetry data record is now nearly 30 years old, and we may begin to consider employing it in a deep learning (DL) context, which is by nature data-hungry and has remained somewhat unexplored territory until now.

Even though Global Mean Sea Level (GMSL) largely changes linearly with time (approximately 3 mm/year), this global average exhibits large geographical variations and covers a suite of regional non-linear signals, changing in both space and time.
Because DL can capture the non-linearity of the system, it holds intriguing promise.
Furthermore, improving the mapping and understanding of these regional signals will enhance our ability to project sea level changes into the future.
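
As a loose illustration of the separation between the roughly linear global trend and the regional non-linear residuals a DL model would target, the sketch below uses purely synthetic, hypothetical numbers (grid size, signal amplitudes, noise level) rather than real altimetry data.

```python
# Minimal sketch (hypothetical synthetic data): remove the roughly linear
# global-mean trend (~3 mm/year) from a gridded sea-level anomaly record to
# expose the regional, non-linear residual signals a DL model would target.
import numpy as np

rng = np.random.default_rng(0)
n_months, n_lat, n_lon = 360, 18, 36          # ~30 years of monthly 10-degree grids
t = np.arange(n_months) / 12.0                # time in years

# Synthetic stand-in for gridded sea-level anomalies [mm]: linear trend plus
# a regionally varying seasonal/non-linear signal plus noise.
trend = 3.0 * t                               # ~3 mm/year global-mean rise
regional = 20.0 * np.sin(2 * np.pi * t)[:, None, None] * rng.random((1, n_lat, n_lon))
sla = trend[:, None, None] + regional + rng.normal(0, 5, (n_months, n_lat, n_lon))

# Fit and remove the linear trend of the global mean; the residual field is
# the spatially and temporally varying signal left for a non-linear model.
gmsl = sla.mean(axis=(1, 2))
slope, intercept = np.polyfit(t, gmsl, 1)
residual = sla - (slope * t + intercept)[:, None, None]
print(f"fitted global trend: {slope:.2f} mm/year; residual std: {residual.std():.1f} mm")
```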

Previously, the use of machine learning techniques in altimetry settings was hampered by a lack of data, while the explainability of DL models has been an issue, as have the computing requirements.
In addition, machine learning models do not generally output uncertainties in their predictions.

Today, though, datasets have approached a suitable size, model explainability can be addressed with permutation importance and SHAP values, computing is cheap, and it is possible to include information on uncertainties as well.
Uncertainties can be handled by appropriate loss functions, ensemble techniques, or Bayesian methods, which means the time has come to employ 30 years of satellite altimetry data to improve our predictive power for sea-level changes.
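
To make the loss-function route concrete, the sketch below shows one common pattern: a network that outputs both a mean and a variance, trained with a Gaussian negative log-likelihood. The toy architecture, feature count, and random placeholder data are illustrative assumptions, not the project's actual model.

```python
# Minimal sketch (hypothetical model and data): a small network that predicts
# both a mean and a variance for sea level, trained with a Gaussian negative
# log-likelihood so each prediction carries its own uncertainty estimate.
import torch
import torch.nn as nn

class MeanVarianceNet(nn.Module):
    def __init__(self, n_features: int = 8):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU())
        self.mean_head = nn.Linear(64, 1)      # predicted sea-level anomaly
        self.logvar_head = nn.Linear(64, 1)    # predicted log-variance

    def forward(self, x):
        h = self.backbone(x)
        return self.mean_head(h), torch.exp(self.logvar_head(h))

# Hypothetical training step on random placeholder data.
model = MeanVarianceNet()
criterion = nn.GaussianNLLLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x, y = torch.randn(32, 8), torch.randn(32, 1)
mean, var = model(x)
loss = criterion(mean, y, var)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"NLL loss: {loss.item():.3f}")
```

Ensembles and Bayesian methods are alternatives to this pattern: an ensemble estimates uncertainty from the spread of several independently trained models, while Bayesian approaches place distributions over the weights themselves.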

The type of dataset will vary according to the problem area: for climate and long-term changes, monthly averaged, low-resolution records will be adequate.
However, for studies of extreme events, such as flooding, we need daily or better averages at as high a spatial resolution as possible.
This will increase the amount of data many-fold.
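
A rough back-of-envelope calculation illustrates the scale of that increase; the grid resolutions below are assumptions chosen for the arithmetic, not the project's actual products.

```python
# Illustrative data-volume comparison: monthly 0.25-degree grids versus
# daily grids at twice the spatial resolution (hypothetical resolutions).
monthly_low_res = 12 * (180 * 4) * (360 * 4)     # grid values per year, 0.25-degree monthly
daily_high_res = 365 * (180 * 8) * (360 * 8)     # grid values per year, 0.125-degree daily
print(f"monthly 0.25-deg grid: {monthly_low_res:,} values/year")
print(f"daily 0.125-deg grid:  {daily_high_res:,} values/year")
print(f"increase factor:       {daily_high_res / monthly_low_res:.0f}x")   # ~120x
```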

This project will focus on the above problems in both global and regional settings, and we will try to model some extreme sea-level events that have caused flooding in the past.

The presentation will highlight our vision for 1) how best to structure the data and make it available to other teams pursuing DL applications, 2) how to continually incorporate new data into the model to counter data drift, 3) how best to ensure predictions contain uncertainties, and 4) how to make the model available for consumption using cloud technologies.
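
For point 1, one possible direction is a chunked, cloud-friendly store of the gridded altimetry record. The sketch below assumes xarray with the Zarr backend; the file name, 1-degree grid, time span, and chunking are hypothetical choices for illustration only.

```python
# Minimal sketch (hypothetical grid, file name, and chunking): storing gridded
# sea-level anomalies as a chunked Zarr store so other teams can stream it
# for DL training without downloading the full record.
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("1993-01-01", periods=360, freq="MS")   # ~30 years, monthly
lat = np.arange(-89.5, 90, 1.0)                               # 1-degree grid (illustrative)
lon = np.arange(0.5, 360, 1.0)

ds = xr.Dataset(
    {"sla": (("time", "lat", "lon"),
             np.zeros((time.size, lat.size, lon.size), dtype="float32"))},
    coords={"time": time, "lat": lat, "lon": lon},
)

# Chunk along time so training pipelines can read one month at a time.
ds.to_zarr("sla_monthly.zarr", mode="w",
           encoding={"sla": {"chunks": (1, lat.size, lon.size)}})
```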