Long-term NWP-based SMOS monitoring
Peter Weston | European Centre for Medium-Range Weather Forecasts (ECMWF) | United Kingdom
SMOS observations are operationally monitored against short-range numerical weather prediction (NWP) forecasts at ECMWF. The NWP forecasts are transformed from model space into observation space by the Community Microwave Emissivity Model (CMEM) and these simulated brightness temperatures (Tbs) are subtracted from the observed SMOS L1C Tbs to calculate background (or first-guess) departures. The statistics of these departures are plotted as time series, Hovmöller and map plots to monitor the quality of the SMOS observations.
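As a minimal sketch of how such departure statistics are computed (the function name, interface and Tb values are illustrative, not the operational ECMWF code, which also applies quality control and bias correction):

```python
import numpy as np

def departure_stats(observed_tb, simulated_tb):
    """Background (first-guess) departures: observed minus simulated Tbs.

    Illustrative only; the operational monitoring aggregates these
    statistics into time series, Hovmoller and map plots.
    """
    d = np.asarray(observed_tb) - np.asarray(simulated_tb)
    return {"n": d.size, "mean": float(d.mean()), "std": float(d.std())}

# Hypothetical Tb samples in kelvin
stats = departure_stats([245.1, 250.3, 248.7], [244.8, 251.0, 248.2])
```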
In general, the short-range NWP forecasts provide a stable and high-quality baseline to compare the SMOS observations against. However, every so often, the NWP forecast model and monitoring system are updated with the latest scientific developments. In addition, the SMOS observed Tbs are periodically re-calibrated to reduce residual biases, improve noise performance or enhance quality control. Both eventualities can result in sudden changes to the monitoring statistics, making longer-term trends more difficult to identify and analyse.
One solution to this issue is to perform long-term monitoring using a fixed NWP model version, a fixed monitoring system configuration and a fixed version of the SMOS L1C Tbs. In this presentation, results are presented from such a system using ERA5 reanalysis model fields, the ECMWF monitoring system at a recently operational model cycle (47r2) and the latest version (v724) of the SMOS L1C Tbs. This combination makes use of the stability and high quality of the ERA5 model fields, recent developments in the SMOS quality control procedures at ECMWF, and improvements to the SMOS L1 processor at v724.
Monitoring statistics from this system will be presented, including long-term trends over the entire lifetime (2010-present) of the SMOS mission. In particular, throughout its lifetime SMOS has been significantly affected by radio frequency interference (RFI) at the L-band frequency (1.41 GHz) at which it measures. The latest version of the SMOS L1 processor includes significantly improved RFI screening compared to previous versions. The effectiveness of this screening over time will be assessed by examining the long-term NWP monitoring statistics.
EVDC – Experience in archiving and data management in the Calibration and Validation domain
Jarek Dobrzanski | Skytek | Ireland
The EVDC (initially the Envisat Validation Data Centre) was originally set up in 2000 to support the commissioning of the atmospheric instruments on board the Envisat satellite. It provided a central database for all routinely acquired correlative geophysical data used in Envisat Calibration and Validation, together with tools for the ingestion, quality control and retrieval of data from this database. With the new EVDC (ESA atmospheric Validation Centre) project that started in 2016, the system has been modernised and a large set of new functionalities and tools has been added, including a satellite element (access to Sentinel-5 Precursor, Aeolus and MIPAS data). The EVDC is accessible at https://evdc.esa.int/ and has become an important tool for the satellite user community, including the Sentinel-5 Precursor Mission Performance Centre, which regularly retrieves ground-based Cal/Val data and Fiducial Reference Measurements from the EVDC for the routine validation of the satellite products.
The value of the historic Cal/Val data and those acquired in ongoing activities has been recognised, and their long-term preservation and management have been supported through the definition and application of the Generic Earth Observation Metadata Standard (GEOMS), which ensures harmonised specification and reporting of Cal/Val data and metadata. Application of GEOMS is a core element in data management, quality assessment and ensuring interoperability with other agencies and data centres. To further support interoperability and federation with the data holdings of other data centres, EVDC applies metadata harvesting and sharing methods based on ESA’s Data Centre Inter Operability (DCIO) project, using the Open Archives Initiative Protocol for Metadata Harvesting (OAI-PMH).
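For illustration, an OAI-PMH harvesting request is built from a small set of standard query parameters; the sketch below constructs such a request URL. The endpoint URL and set name are hypothetical, while `verb`, `metadataPrefix` and `set` are arguments defined by the OAI-PMH specification:

```python
import urllib.parse

def list_records_url(base_url, metadata_prefix="oai_dc", set_spec=None):
    """Build an OAI-PMH ListRecords request URL from a base endpoint."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec is not None:
        params["set"] = set_spec  # optional selective-harvesting set
    return base_url + "?" + urllib.parse.urlencode(params)

# Hypothetical endpoint and set name, for illustration only
url = list_records_url("https://example.org/oai", set_spec="calval")
```

A harvester would fetch this URL, parse the XML response, and follow the `resumptionToken` elements the protocol defines for paging through large result sets.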
Data exchange with the EVDC is regulated by a protocol that aims to ensure data ownership, prevent re-distribution to third parties and protect intellectual property. Moreover, EVDC provides a service to issue DOIs and landing pages for Cal/Val datasets.
As part of a continued effort to make the EVDC platform more accessible, a set of video tutorials explaining how to work with most of the tools has been made available through the EVDC Web Platform (see EVDC Newsletter: https://evdc.esa.int/documentation/newsletters/).
The heritage of the original Envisat Validation Data Centre continues to be built on to provide long-term stability of Cal/Val data archiving (data holdings are around 2 million Cal/Val files, including Campaign Data and Rapid Delivery Data) and improvements in data management and discoverability, with further evolutions anticipated to cover new functionalities and missions over the coming years.
The presentation will address the following topics and lessons learned:
• The advantages and challenges of the application of metadata standards for Cal/Val data, and their application to historical as well as currently operational data;
• The improvements in Cal/Val data management which are achievable through applying these standards;
• The role of EVDC in federated Cal/Val data to encourage cooperation between the various data archives and promote open data policy, as well as providing robustness (e.g. as a backup archive for networks such as Pandonia);
• Evolutions planned for new missions and in data management, including:
• EarthCARE preparations, including GEOMS evolutions and using DCIO to establish links to additional data centres relevant for the mission;
• Implementation of myEVDC functionality following a typical exploitation platform model;
• A potential pilot project to assess the inclusion of Cal/Val data in ESA’s Data Information System;
• A Collocation Reference Database, with a major aim of improving delta validations of processor upgrades prior to systematic reprocessing.
We will also discuss potential applications of EVDC’s experience in Cal/Val data archiving, management and processing to other application domains.
Earth Observation Data Information Service
Alessandra Paciucci | SERCO Italia, Frascati, Italy | Italy
ESRIN, the European Space Agency’s centre for Earth observation, has commissioned a centralised service to archive and store metadata relevant to Earth Observation data, collecting the metadata as it is generated in the different systems and services of the ground segment. This paper describes how the service has evolved through its various phases, how the metadata is organised, and the way in which the service provides standard interfaces to allow data to be collected from additional external sources.
The service was developed and deployed into operation as part of the DSI (Data Service Initiative) contract led by Serco. It consists of extracting metadata from several value-adding systems and services and collecting it for every dataset and data product into a centralised system. This allows all changes to the data to be managed and traced, like any other item under standard configuration control. DIS includes a powerful presentation layer that provides a unique insight: an overview of data holdings, data provenance, and the processing history of products, datasets and missions.
As the number of satellites increases, generating more data, and the archives stretch back over a longer time period, maintaining knowledge of the data becomes more challenging. DIS is a helpful tool for retaining visibility and understanding of these EO data assets, as well as for comparing the contents of different archives to ensure that datasets are fully identified, differences between archives are understood and specific holdings can be compared.
The stakeholders mostly include Management, whom DIS assists with decision making, and Operations, who are supported by a detailed and accurate account of the location, status and cross-relationships of the data.
The approach used to handle heterogeneous sources of EO data and provide support to ESA Management and Operations will be presented.
DAMPS, an ESA Service of EO data Archival, Retrieval, Maintenance, Consolidation and (re)Processing
Dr. Olivia Lesne | ACRI-ST | France
DAMPS, whose acronym stands for D[ata] A[rchival], M[anagement] & P[rocessing] S[ervices] (and where PDGS stands for “Payload Data Ground Segments”), was commissioned in 2021 by the European Space Agency (ESA) under various programmes, including Long-Term Data Preservation (LTDP-2).
DAMPS provides a secure and reliable archive, data configuration management, retrieval/collection, maintenance, consolidation and (re-)processing and formatting of ESA’s Earth Observation data. The purpose of the activity is the delivery of services to cover a coherent set of value-adding activities focused on the data.
The backbone characteristics of the Service are:
• safe data archival;
• accurate data management;
• responsiveness to user needs;
• adaptability to users’ situations;
• data management & processing as a service;
• smooth interfaces between all the services.
Together, these build up an end-to-end value chain for the data.
(1) DAMPS Data Processing Projects are hosted in the same infrastructure used for the archive, providing easy and fast access to the EO data and an efficient way to process them, avoiding the export of data to other premises. All work is performed on servers inside the DAMPS network, ensuring data safety and a maximum control of data curation in the archive.
(2) The archive is distributed between two locations separated by more than 1000 km: a Master Archive in Luxembourg (Betzdorf) and a Backup Archive in the south of France (Sophia-Antipolis). The archive currently contains 8.2 PB of data from 57 ESA and Third-Party Missions (TPM). It is composed of data from historical and live missions as well as in-situ “campaign” data in EO-SIP and native formats. Ingestion is realized in various ways, such as bulk ingestion from media or on-line ingestion. The service is organized to cover future ESA Earth Explorers as well as any additional Third-Party Missions.
(3) Data integrity is ensured by the systematic use of data containers with error detection for the AIP (Archival Information Package). Another crucial activity performed by DAMPS is data quality checking at all steps of data archiving (from ingestion to delivery), not in terms of scientific content but in terms of the reliability of the process during data transfer. Moreover, routine data container integrity checks are performed across the archive to verify that no archived data is corrupted.
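A minimal sketch of this kind of routine integrity check, assuming a stored manifest of SHA-256 checksums (the manifest format and function names are illustrative, not the actual DAMPS implementation):

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Compute the SHA-256 checksum of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_manifest(manifest, root):
    """Return the names of files whose checksum no longer matches the manifest."""
    return [name for name, digest in manifest.items()
            if sha256_of(Path(root) / name) != digest]
```

Run periodically over the archive, a check like this detects silent corruption (bit rot, truncated transfers) without inspecting the scientific content of the products.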
(4) The DMS (Data Management System) of DAMPS keeps track of the accurate contents of the archive, the value-adding operations performed on it (consolidation, processing, circulation…) and the relationships between datasets. DAMPS is also home to the DIS (Data Information System), which collects data and value-added information from DAMPS itself and other services to provide a one-stop overview of the contents and provenance of the data.
(5) A DAMPS web portal (damps.info) presents the description of all DAMPS services and related news. It provides useful interfaces showing, for instance, an up-to-date status and near-real-time monitoring of data ingestion, as well as an overview of the full archive content. The portal also hosts a dedicated space for DIS, including a link to the operational DIS and a full overview of the planned, current and completed Projects.
To reach that stage, ESA put together the scope of two predecessor services (DAS - Data Archival Service and DSI - Data Service Initiative) to increase the overall efficiency while building upon the experience gained.
The service is delivered by an industrial consortium with a rich mix of skills and experience in EO data and service management, composed of ACRI-ST (FR, Prime contractor), adwäisEO (LU), Serco (IT), ARGANS (UK) and KSAT (NO) and two additional partners Solenix (DE), S&T (NL).
COPA - Copernicus: 4 Core Products Algorithm Studies -- Impact of the SLSTR geometry configuration versus (A)ATSR geometry configuration on retrieved time series
Dr. Pekka Kolmonen | Finnish Meteorological Institute | Finland
The main geometric difference potentially affecting the L2 retrieval results between the (A)ATSR (on board ERS-2 and Envisat) and the SLSTR (on board Sentinel-3A and -3B) instruments is the change of the oblique view from 55° forward to 55° backward. This leads to situations where, especially in the oblique view, the forward-scattering conditions of the (A)ATSR instrument are replaced by backward-scattering conditions of the SLSTR instrument. The orientation of the oblique view relative to the satellite track can affect the quality of the aerosol products at the global scale through the conjunction of two factors: on the one hand, the TOA signal strength (and thus the resulting radiometric noise) and, on the other hand, the sensitivity of the measurement to aerosol and surface parameters, both of which depend on the observing geometry and on the azimuthal scattering/reflection angle.
In this study, the analysis of the impact of the geometry has been carried out with simulations and time-series investigations in which results from (A)ATSR and SLSTR retrievals have been compared. The results presented here relate to the AOD (aerosol optical depth) and surface reflectance products. The main goal of the study is to give recommendations on how to combine time series from (A)ATSR and SLSTR and to give uncertainty estimates.
Sentinel-3 scenes were created with forward simulations for the (A)ATSR and SLSTR geometries. The forward simulations were calculated in two ways. First, the forward model of the aerosol retrieval algorithm (ESA synergy processor SYN_2_AOD) was used to compute the Top-Of-Atmosphere (TOA) reflectance. Second, independent simulations for the scenes were prepared using a surface reflectance database (ESA ADAM) coupled with an independently defined atmosphere. The TOA reflectance was computed using the libRadtran radiative transfer algorithm. The two approaches enabled the study of various geometry related aspects of the aerosol and surface reflectance retrieval.
To study how the (A)ATSR and SLSTR AOD evolve and agree, time series were produced for global and regional AOD. Here, auxiliary information from e.g. the MODIS or MISR instruments was also employed, as there is a gap of several years between the (A)ATSR and SLSTR products. While it is quite straightforward to create rules to correct the SLSTR bias with respect to the gap-filling information, there are several issues which need to be considered. For instance, the retrieval geometry difference between (A)ATSR and SLSTR could simply be treated as a North-South phenomenon because of the mirroring of the oblique view of the instruments. This is not, however, true for the whole globe, as there is regional variation in the observed biases. Also, the reference MODIS/MISR products need to be scrutinised with respect to the validation bias time series: this bias must not vary too much across the (A)ATSR/SLSTR gap if reliable (A)ATSR/SLSTR biases with respect to MODIS/MISR are to be determined.
We will present the main results of the simulations and time series analysis. In addition, some initial recommendations are presented and discussed.
Snow Cover in Europe derived from historical AVHRR Data – a TIMELINE thematic processor
Dr. Andreas Jürgen Dietz | German Aerospace Center (DLR), German Remote Sensing Data Center (DFD) | Germany
The AVHRR (Advanced Very High Resolution Radiometer) mission offers a data set that goes back continuously to the early 1980s. These data are processed and examined as part of the TIMELINE (TIme series processing of Medium resolution Earth observation data assessing Long-term dynamics In our Natural Environment) project, which focuses on a 30-year homogenized time series of NOAA- and METOP-AVHRR data from Europe and North Africa. This time series is generated using the historical data archive of the German Remote Sensing Data Center (DFD) at the German Aerospace Center (DLR), which reaches back to 1981. The main goal of TIMELINE is to develop consistent, reproducible, transparent, reliable and generic geoscientific variables for research related to global change. These outputs are used to answer climate-related questions, enable the detection of changes and identify geoscientific phenomena and trends. A wide range of geoscientific products is generated within TIMELINE and made available to the public via a free and open data policy. The processing of these products follows a sophisticated sequence of individual steps, including preprocessing (calibration, chip matching, orthorectification), atmospheric and BRDF correction, cloud masking and the generation of geophysical and thematic products at Level 2 and Level 3. All processing steps are based on solid operational algorithms and are thoroughly documented and validated. One of these thematic products is the snow cover, which is calculated on the basis of the atmospherically corrected Level 1B data.
Snow has the largest spatial extent of the entire cryosphere, but at the same time also the greatest seasonality. Passive microwave remote sensing was used early on to determine the water content of the snow cover, but its geometric resolution is insufficient for a precise investigation of the variability of the spatial extent of the snow cover. Multispectral sensors with a higher geometric resolution exploit an important property of snow: snow has a very high reflectance in the visible spectral range (VIS) and a very low reflectance in the short-wave infrared (SWIR). This property is used in the Normalized Difference Snow Index (NDSI), in which the difference between the VIS and SWIR reflectances is divided by their sum. The index can take values between -1 and +1, with an NDSI greater than 0.4 indicating a snow cover of over 50%. In areas with dense vegetation cover, the NDSI can take lower values (down to 0.1) even when the ground is completely covered with snow; such cases are identified by additionally calculating the Normalized Difference Vegetation Index (NDVI). In the TIMELINE project, AVHRR bands 1 and 3A are used for the NDSI calculation and AVHRR bands 1 and 2 for the NDVI. If only band 3B exists, the reflective component is calculated from the radiance measurements and an artificial band 3A is formed. The NDSI is also suitable for distinguishing between snow and clouds, but fails for ice clouds. Therefore, ERA5 skin temperature data is included and the temperature difference is used as a differentiation criterion. The resulting Level 2 product contains a thematic snow classification with an associated quality layer for each AVHRR scene in swath projection.
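The index arithmetic described above can be sketched as follows. The band inputs are generic reflectances, and since the exact combined NDSI/NDVI decision rule used in TIMELINE is not spelled out here, the vegetated-pixel branch below is a simplified placeholder with illustrative thresholds:

```python
import numpy as np

NDSI_SNOW = 0.4        # NDSI above this indicates > 50% snow cover
NDSI_FOREST_MIN = 0.1  # under dense vegetation, snow can reach NDSI as low as 0.1

def ndsi(vis, swir):
    """Normalized Difference Snow Index: (VIS - SWIR) / (VIS + SWIR)."""
    vis, swir = np.asarray(vis, float), np.asarray(swir, float)
    return (vis - swir) / (vis + swir)

def ndvi(vis, nir):
    """Normalized Difference Vegetation Index: (NIR - VIS) / (NIR + VIS)."""
    vis, nir = np.asarray(vis, float), np.asarray(nir, float)
    return (nir - vis) / (nir + vis)

def is_snow(vis, nir, swir, ndvi_veg=0.2):
    """Snow if NDSI > 0.4, or 0.1 < NDSI <= 0.4 over vegetated pixels.

    The NDVI threshold is illustrative, not the TIMELINE criterion.
    """
    s, v = ndsi(vis, swir), ndvi(vis, nir)
    forest_snow = (s > NDSI_FOREST_MIN) & (s <= NDSI_SNOW) & (v > ndvi_veg)
    return (s > NDSI_SNOW) | forest_snow
```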
The Level 2 products are first brought into the TIMELINE projection (ETRS89-extended / LAEA Europe) and divided into 4 tiles in order to create daily composites. The pixel-based value assignment is based on the quality layers. The percentage of snow cover over 10 days and per month is then calculated from the daily products (excluding cloud-covered pixels). For better comparability with the 8-day MODIS snow product, an aggregated 8-day snow cover is also calculated. The daily data is also used to obtain cloud-free images using the DLR Global SnowPack processor. From this, the snow cover duration and the beginning and end of the snow cover season can be calculated for each pixel. This is used for trend analysis of the entire time series and for assessing the accuracy with the help of the MODIS product, which has existed since 2000.
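The per-pixel aggregation from daily classifications to a period snow-cover percentage, skipping cloud-covered days, can be sketched as follows (the class codes and function name are illustrative, not the TIMELINE product encoding):

```python
import numpy as np

SNOW, LAND, CLOUD = 1, 0, 2  # illustrative class codes

def snow_cover_percent(daily_stack):
    """Per-pixel snow-cover percentage over a stack shaped (days, y, x),
    counting only cloud-free observations; NaN where every day is cloudy."""
    stack = np.asarray(daily_stack)
    snow_days = (stack == SNOW).sum(axis=0)
    valid_days = (stack != CLOUD).sum(axis=0)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(valid_days > 0, 100.0 * snow_days / valid_days, np.nan)
```

The same pattern extends to 10-day, monthly or 8-day windows by slicing the daily stack accordingly before aggregating.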