Day 4

Detailed paper information

Paper title: Sentinel-1 and Sentinel-2 time-series automatic analysis within a Copernicus Collaboration Ground Segment platform, Terrascope
  1. Sophie Petit, Institut Scientifique de Service Public (ISSeP), Speaker
  2. Coraline Wyard, Institut Scientifique de Service Public (ISSeP)
  3. Gérard Swinnen, Institut Scientifique de Service Public (ISSeP)
  4. Éric Hallot, Institut Scientifique de Service Public (ISSeP)
Form of presentation: Poster
  • Open Earth Forum
    • C5.03 Open Source, data science and toolboxes in EO: Current status & evolution
Abstract text: Within the framework of the SARSAR project, which aims to use the Sentinel satellite data of the European Copernicus program for the monitoring of redevelopment sites, a processing chain has been developed for change detection and classification. The need for such a methodology arises from the fact that the Walloon Region, the southern part of Belgium, has to manage an inventory of more than 2220 “Redevelopment Sites” (RDS), mainly former abandoned industrial sites. These sites represent a deconstruction of the urban fabric, but also offer an opportunity for sustainable urban planning thanks to their potential for redevelopment. The management of the inventory, which is mostly done through field visits, is costly in both time and resources, and using Earth Observation data is a real opportunity to develop an operational tool for prioritizing the sites to be investigated further in the field. It allows selecting only the sites showing signs of change and already provides an indication of the type of change to expect.

The general processing chain we have developed enables us to process the images in order to detect and classify changes, and thus to provide a final report whose results are directly usable by public authorities. More precisely, in SARSAR it consists of the following three successive blocks. The first block includes the following steps: selection of the relevant Sentinel data (selection of images based on the percentage of clouds for Sentinel-2, …), clipping based on the RDS polygons from the inventory vector file, extraction of sigma0VH from Sentinel-1 and of indices from Sentinel-2, linear interpolation to fill the gaps, and smoothing of the data using a Gaussian kernel with a standard deviation of 61. These steps lead to the creation of a temporal profile per feature and per RDS. The second block first applies the PELT (Pruned Exact Linear Time) change detection method, which is based on the solution of a minimization problem and provides an exact segmentation of the temporal profiles. This makes it possible to determine whether a change has occurred and, if so, to estimate its date. Secondly, various Sentinel-2 indices and Sentinel-1 sigma0VH are used to determine the type of change (vegetation, building or soil), its direction, if any, and its amplitude. Finally, the third block is the automatic production of reports, directly usable by field operators, presenting the results by RDS and providing a priority order for the RDS to be investigated.
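The profile-preparation and change-detection steps above can be sketched in a few lines of Python. This is an illustrative sketch, not the project's code: the synthetic NDVI-like series is invented, and the exhaustive single-changepoint search is a simplified stand-in for the PELT algorithm (which, in practice, would typically come from a dedicated library such as `ruptures`).

```python
# Sketch of the per-RDS temporal-profile preparation and change detection.
# Assumptions: synthetic data, one changepoint; the project uses PELT for
# an exact multi-changepoint segmentation.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def prepare_profile(days, values, grid):
    """Fill gaps by linear interpolation onto a regular daily grid,
    then smooth with a Gaussian kernel (sigma = 61, as in the abstract)."""
    filled = np.interp(grid, days, values)
    return gaussian_filter1d(filled, sigma=61)

def single_changepoint(profile):
    """Simplified stand-in for PELT: choose the split that minimizes the
    summed squared deviation from each segment's mean."""
    n = len(profile)
    best_t, best_cost = None, np.inf
    for t in range(10, n - 10):
        left, right = profile[:t], profile[t:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# Synthetic NDVI-like profile: vegetation cleared around day 400.
grid = np.arange(0, 730)                      # daily grid over two years
obs_days = np.arange(0, 730, 12)              # irregular, cloud-free acquisitions
obs = np.where(obs_days < 400, 0.7, 0.2)      # step change in the index
obs = obs + np.random.default_rng(0).normal(0, 0.05, obs.size)
profile = prepare_profile(obs_days, obs, grid)
t = single_changepoint(profile)
print(f"estimated change around day {t}")
```

The estimated date lands near the true break at day 400; on real profiles the same cost-minimization idea, pruned for speed, is what PELT applies recursively to find all changepoints.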

The processing chain has been implemented in the Belgian Copernicus Collaborative Ground Segment, TERRASCOPE (managed by VITO), which offers, via virtual machines and Jupyter notebooks, pre-processed Sentinel data (L2A Sentinel-2) and computing capacity. This allows the whole workflow to be automated while processing a large amount of data and providing near-real-time results.

The TERRA2SAR project presents the improvements made to the code of the original processing chain, in order to share operational Python Jupyter Notebooks that can be reused in various scientific domains. The same type of processing chain could be useful to a larger scientific community and for other types of applications, specifically the monitoring of mid- and long-term land-cover changes at a selection of sites of different sizes spread over large areas. For example, it could be used to monitor the same type of brownfields in other countries, as a decision-support tool to distinguish between different types of grassland (temporary or permanent), or to detect changes at specific sites (airports, ports, railroads, …).

The project is divided into two parts. The first provides Notebooks compatible with the standard TERRASCOPE virtual-machine configuration. This methodology uses the common GDAL library and a SQL database engine, SQLite. It uses 8 GB of RAM, is single-threaded due to SQLite limitations, and is accessible to one user at a time. It is therefore suitable for small or limited data sets, in terms of either geographic or temporal footprint. It is also easier to read and modify, and lends itself better to experimentation. The second part provides Python Jupyter Notebooks based on an upgraded TERRASCOPE configuration. This upgrade consists of moving to a dedicated machine with 24 GB of RAM, 12 CPU cores, and a personalized PostgreSQL/PostGIS installation. This methodology is more stable and more efficient than the ‘SQLite methodology’, as it allows faster computation and multi-threading. Moreover, it is accessible to several users or applications at a time. As disadvantages, it requires resources that are not part of the standard package for TERRASCOPE users, as well as more qualified personnel for implementation and maintenance. It is therefore suitable for the production phase of applications that require the manipulation of large data sets. It should be noted that this upgraded TERRASCOPE configuration is provided by VITO on demand only, for other projects that might be interested in it.
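The ‘SQLite methodology’ can be illustrated with a minimal, self-contained sketch: a single-file database holding one temporal profile per RDS and per feature, queried once per site for the change-detection step. The table and column names are illustrative assumptions, not the project's actual schema.

```python
# Minimal sketch of the single-user 'SQLite methodology': one table of
# (site, feature, day, value) rows. Schema and data are illustrative only.
import sqlite3

con = sqlite3.connect(":memory:")  # on Terrascope this would be a file on disk
con.execute("""
    CREATE TABLE profile (
        rds_id   INTEGER,           -- redevelopment-site identifier
        feature  TEXT,              -- e.g. 'NDVI' or 'sigma0VH'
        day      INTEGER,           -- day index on the regular grid
        value    REAL               -- smoothed feature value
    )""")
rows = [(1, "NDVI", d, 0.7 - 0.001 * d) for d in range(0, 365, 5)]
con.executemany("INSERT INTO profile VALUES (?, ?, ?, ?)", rows)
con.commit()

# One query per (site, feature) pair feeds the change-detection step.
# SQLite serialises writers, which is why this setup is single-user;
# the PostgreSQL/PostGIS variant lifts that restriction.
(cur_count,) = con.execute(
    "SELECT COUNT(*) FROM profile WHERE rds_id = 1 AND feature = 'NDVI'"
).fetchone()
print(cur_count)  # prints 73
```

The PostgreSQL/PostGIS variant keeps the same logical schema but adds concurrent access and spatial types for the RDS polygons, which is what makes it the better fit for the production phase.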