Automated Workflow for High-Resolution 4D Vegetation Monitoring Using Stereo Vision

Publication: Contribution to a journal › Research article › Contributed › Peer-reviewed

Contributors

  • Martin Kobe, Helmholtz-Zentrum für Umweltforschung (UFZ) (Author)
  • Melanie Elias, Professur für Photogrammetrie (Author)
  • Ines Merbach, Helmholtz-Zentrum für Umweltforschung (UFZ) (Author)
  • Martin Schädler, Helmholtz-Zentrum für Umweltforschung (UFZ), Deutsches Zentrum für integrative Biodiversitätsforschung (iDiv) Halle-Jena-Leipzig (Author)
  • Jan Bumberger, Helmholtz-Zentrum für Umweltforschung (UFZ), Deutsches Zentrum für integrative Biodiversitätsforschung (iDiv) Halle-Jena-Leipzig (Author)
  • Marion Pause, Anhalt University of Applied Sciences (Author)
  • Hannes Mollenhauer, Helmholtz-Zentrum für Umweltforschung (UFZ) (Author)

Abstract

Precision agriculture relies on understanding crop growth dynamics and plant responses to short-term changes in abiotic factors. In this technical note, we present and discuss a technical approach for cost-effective, non-invasive, time-lapse crop monitoring that automates the derivation of plant parameters, such as biomass, from 3D object information obtained via stereo images in the red, green, and blue (RGB) color space. The novelty of our approach lies in the automated workflow, which includes a reliable automated data pipeline for 3D point cloud reconstruction from dynamic scenes of RGB images with high spatio-temporal resolution. The setup is based on a permanently installed, rigid, and calibrated stereo camera and was tested over an entire growing season of winter barley at the Global Change Experimental Facility (GCEF) in Bad Lauchstädt, Germany. For this study, radiometrically aligned image pairs were captured several times per day from 3 November 2021 to 28 June 2022. We performed image preselection using a random forest (RF) classifier with a prediction accuracy of 94.2% to eliminate unsuitable images, e.g., shadowed ones, in advance and obtained 3D object information for 86 records of the time series using the 4D processing option of the Agisoft Metashape software package, achieving mean standard deviations (STDs) of 17.3–30.4 mm. Finally, we determined vegetation heights by calculating cloud-to-cloud (C2C) distances between a reference point cloud, computed at the beginning of the time-lapse observation, and the respective point clouds measured in succession, with an absolute error of 24.9–35.6 mm in the depth direction. The calculated growth rates derived from RGB stereo images match the corresponding reference measurements, demonstrating the adequacy of our method for monitoring geometric plant traits, such as vegetation heights and growth spurts, during stand development using automated workflows.
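The height derivation described in the abstract can be illustrated with a simplified sketch. The paper computes C2C distances between a bare-ground reference cloud and later crop-canopy clouds; the snippet below is not the authors' implementation (which relies on Agisoft Metashape and C2C distances in the depth direction), but a minimal nearest-neighbour variant assuming SciPy's `cKDTree`: each canopy point is matched to the closest reference point in the XY plane, and the signed Z difference is taken as vegetation height.

```python
import numpy as np
from scipy.spatial import cKDTree

def c2c_heights(reference: np.ndarray, measurement: np.ndarray) -> np.ndarray:
    """Per-point vegetation height: Z difference between each measured point
    and its nearest reference-cloud neighbour in the XY plane.

    Both inputs are (N, 3) arrays of XYZ coordinates. This is a simplified
    stand-in for a full 3D cloud-to-cloud (C2C) distance computation.
    """
    tree = cKDTree(reference[:, :2])           # index reference points by XY
    _, idx = tree.query(measurement[:, :2])    # nearest reference point per measured point
    return measurement[:, 2] - reference[idx, 2]  # signed height difference

# Synthetic example: a flat ground reference vs. a canopy 0.5 m higher,
# with small XY jitter mimicking reconstruction noise.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, size=(1000, 2))
reference = np.c_[xy, np.zeros(1000)]
canopy = np.c_[xy + rng.normal(0.0, 0.002, xy.shape), np.full(1000, 0.5)]

heights = c2c_heights(reference, canopy)
print(round(float(heights.mean()), 3))  # prints 0.5
```

In the actual workflow, repeating this comparison against the season-start reference cloud for each of the 86 records yields the vegetation-height time series from which growth rates are derived.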

Details

Original language: English
Article number: 541
Journal: Remote Sensing
Volume: 16
Issue number: 3
Publication status: Published - 31 Jan 2024
Peer-reviewed: Yes

External IDs

unpaywall: 10.3390/rs16030541
Mendeley: 4fc770f0-cd4e-305e-a7aa-5628582ab019
Scopus: 85184848399

Keywords

  • non-invasive setup, precision agriculture, image classification, 4D vegetation monitoring, stereo vision