Day 4

Detailed paper information

Paper title Reconstructing forest scene using a ‘tree library’: testing the impact of reconstruction ratio on radiative transfer simulations
  1. Chang Liu Ghent University Speaker
  2. Kim Calders Ghent University
  3. Félicien Meunier Ghent University
  4. Jean-Philippe Gastellu-Etchegorry Centre d'Etudes Spatiales de la Biosphère
  5. Joanne Nightingale CSX Carbon
  6. Mathias Disney University College London (UCL)
  7. Niall Origo National Physical Laboratory; University College London
  8. William Woodgate The University of Queensland; CSIRO
  9. Hans Verbeeck Ghent University
Form of presentation Poster
  • C2. Digital Twins
    • C2.01 Towards a Digital Twin of the Earth - advances and challenges ahead
Abstract text Radiative Transfer Models (RTMs) with spatially explicit 3D forest structures can simulate highly realistic Earth Observation data at large spatial scales (10s to 100s of m). These RTMs can help us understand forest ecosystem processes and their interaction with the Earth system, as well as make much more effective use of new Earth Observation data. However, explicitly reconstructing 3D forest models at large scale (> 1 ha) requires a tremendous amount of 3D structural, spectral and other information. Such reconstruction is time- and labor-intensive, and sometimes impossible, at large scale. Instead, reconstructing the forest using a “tree library” is a more practical and feasible method. Here, this library is made up of 3D trees with different characteristics (e.g., tree species, height, and diameter at breast height) that form a representative sample of the whole forest stand. This library of tree forms is used to reconstruct a full forest scene at large scale (e.g., 100 x 100 m). With this method, the spatial scale of the reconstructed forest scene can easily be increased to match possible applications (e.g., understanding forest radiative transfer processes, retrieval algorithm development, sensor design, or remote sensing calibration and validation activities).
In this study, we investigated the optimal way to build such a tree library using different reconstruction ratios. We evaluated the accuracy of different scenarios by comparing simulated drone data with actual drone remote sensing images. More specifically, trees were clustered into groups according to their species, height, and diameter at breast height. The number of groups was determined by the reconstruction ratio: it equals the number of trees multiplied by the reconstruction ratio, which ranges from 0 to 1. For each group, one tree was selected at random, and the 3D models of all other trees in that group were replaced by this selected tree in the simulation. We evaluated the accuracy of the new forest scenes using the top-of-canopy Bidirectional Reflectance Factor (BRF). The simulated BRFs of the forest scenes built with different reconstruction ratios were compared with the drone data to evaluate their accuracy. We conducted the experiments at hyperspectral resolution (32 wavebands from 520.44 nm to 885.86 nm). We show that, using new 3D measurement technology and this “tree library” method, it is possible to reconstruct forest scenes with cm-scale accuracy at large spatial scale (> 1 ha) and use these as the basis of new RTM simulation tools.
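The grouping-and-replacement step described above can be sketched in a few lines of Python. This is only an illustrative assumption of how such a pipeline might look, not the authors' actual implementation: the abstract does not specify the clustering algorithm, so a simple sort-and-cut over (species, height, DBH) stands in for it, and all function and field names are hypothetical.

```python
# Hypothetical sketch of the "tree library" construction: cluster trees by
# species, height and DBH into round(n_trees * reconstruction_ratio) groups,
# then pick one random representative per group whose 3D model replaces the
# models of all other trees in that group. Names and data are illustrative.
import random
from collections import defaultdict

def build_tree_library(trees, ratio, seed=0):
    """trees: list of dicts with 'id', 'species', 'height', 'dbh'.
    ratio in (0, 1]; returns {tree_id: representative_tree_id}."""
    rng = random.Random(seed)
    n_groups = max(1, round(len(trees) * ratio))
    # Simple proxy for clustering: order trees by (species, height, dbh)
    # and cut the ordered list into n_groups contiguous chunks
    # (a real study might use k-means or another clustering method).
    ordered = sorted(trees, key=lambda t: (t["species"], t["height"], t["dbh"]))
    groups = defaultdict(list)
    for i, tree in enumerate(ordered):
        groups[i * n_groups // len(ordered)].append(tree)
    mapping = {}
    for members in groups.values():
        rep = rng.choice(members)          # one random representative per group
        for t in members:
            mapping[t["id"]] = rep["id"]   # its 3D model stands in for the group
    return mapping

# Toy stand of 10 trees, two species; ratio = 0.5 gives 5 representatives.
trees = [{"id": i, "species": "beech" if i % 2 else "oak",
          "height": 20 + i, "dbh": 0.3 + 0.01 * i} for i in range(10)]
mapping = build_tree_library(trees, ratio=0.5)
print(len(set(mapping.values())))  # number of distinct representative trees
```

A ratio of 1 would keep every tree's own model (full explicit reconstruction), while smaller ratios trade geometric fidelity for the cheaper measurement and modeling effort that the abstract evaluates via BRF comparisons.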