Automated Workflow for High-Resolution 4D Vegetation Monitoring Using Stereo Vision

Research output: Contribution to journal · Research article · Peer-reviewed

Contributors

  • Martin Kobe, Helmholtz Centre for Environmental Research (Author)
  • Melanie Elias, Chair of Photogrammetry (Author)
  • Ines Merbach, Helmholtz Centre for Environmental Research (Author)
  • Martin Schädler, Helmholtz Centre for Environmental Research, German Centre for Integrative Biodiversity Research (iDiv) Halle—Jena—Leipzig (Author)
  • Jan Bumberger, Helmholtz Centre for Environmental Research, German Centre for Integrative Biodiversity Research (iDiv) Halle—Jena—Leipzig (Author)
  • Marion Pause, Anhalt University of Applied Sciences (Author)
  • Hannes Mollenhauer, Helmholtz Centre for Environmental Research (Author)

Abstract

Precision agriculture relies on understanding crop growth dynamics and plant responses to short-term changes in abiotic factors. In this technical note, we present and discuss a technical approach for cost-effective, non-invasive, time-lapse crop monitoring that automates the process of deriving further plant parameters, such as biomass, from 3D object information obtained via stereo images in the red, green, and blue (RGB) color space. The novelty of our approach lies in the automated workflow, which includes a reliable automated data pipeline for 3D point cloud reconstruction from dynamic scenes of RGB images with high spatio-temporal resolution. The setup is based on a permanently installed, rigid, calibrated stereo camera pair and was tested over an entire growing season of winter barley at the Global Change Experimental Facility (GCEF) in Bad Lauchstädt, Germany. For this study, radiometrically aligned image pairs were captured several times per day from 3 November 2021 to 28 June 2022. We performed image preselection using a random forest (RF) classifier with a prediction accuracy of 94.2% to eliminate unsuitable, e.g., shadowed, images in advance and obtained 3D object information for 86 records of the time series using the 4D processing option of the Agisoft Metashape software package, achieving mean standard deviations (STDs) of 17.3–30.4 mm. Finally, we determined vegetation heights by calculating cloud-to-cloud (C2C) distances between a reference point cloud, computed at the beginning of the time-lapse observation, and the respective point clouds measured in succession, with an absolute error of 24.9–35.6 mm in the depth direction. The calculated growth rates derived from RGB stereo images match the corresponding reference measurements, demonstrating the adequacy of our method for monitoring geometric plant traits, such as vegetation heights and growth spurts, during stand development using automated workflows.
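The image-preselection step described in the abstract (an RF classifier that discards unsuitable, e.g., shadowed, images) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature extraction (global brightness and contrast statistics) and the synthetic training data are assumptions made for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

def image_features(img: np.ndarray) -> np.ndarray:
    """Reduce an RGB image to simple global statistics (illustrative only)."""
    gray = img.mean(axis=2)
    return np.array([gray.mean(), gray.std(), img.std(axis=(0, 1)).mean()])

# Synthetic stand-in data: well-lit images are brighter, shadowed ones darker.
good = rng.normal(150, 20, size=(200, 32, 32, 3)).clip(0, 255)
bad = rng.normal(60, 20, size=(200, 32, 32, 3)).clip(0, 255)
X = np.array([image_features(im) for im in np.concatenate([good, bad])])
y = np.array([1] * 200 + [0] * 200)  # 1 = keep for reconstruction, 0 = discard

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)  # held-out classification accuracy
```

In practice, preselecting images before photogrammetric processing keeps shadowed or otherwise degraded pairs from corrupting the reconstructed point clouds, which is why it precedes the Metashape step in the workflow.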
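The cloud-to-cloud (C2C) height determination can be illustrated with a nearest-neighbour query between a bare-soil reference cloud and a later canopy cloud. The KD-tree, the flat synthetic soil surface, and the 0.30 m canopy are assumptions for this sketch; the abstract does not specify the authors' C2C implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Reference cloud: flat soil surface at z = 0, sampled on a 10 cm grid.
xx, yy = np.meshgrid(np.linspace(0, 1, 11), np.linspace(0, 1, 11))
reference = np.column_stack([xx.ravel(), yy.ravel(), np.zeros(xx.size)])

# Later epoch: synthetic canopy points roughly 0.30 m above the soil.
xy = rng.uniform(0, 1, size=(500, 2))
canopy = np.column_stack([xy, rng.normal(0.30, 0.01, size=500)])

# C2C distance: for each canopy point, distance to its nearest reference point.
tree = cKDTree(reference)
c2c, _ = tree.query(canopy)
mean_height = c2c.mean()  # approximates mean vegetation height in metres
```

With a dense, levelled reference cloud, the nearest-neighbour distance is dominated by the vertical offset, so the C2C statistic tracks canopy height; repeating it per epoch yields the growth curve described in the abstract.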

Details

Original language: English
Article number: 541
Number of pages: 15
Journal: Remote Sensing
Volume: 16 (2024)
Issue number: 3
Publication status: Published - 31 Jan 2024
Peer-reviewed: Yes

External IDs

unpaywall: 10.3390/rs16030541
Mendeley: 4fc770f0-cd4e-305e-a7aa-5628582ab019
Scopus: 85184848399

Keywords

  • non-invasive setup, precision agriculture, image classification, 4D vegetation monitoring, stereo vision