AI-based LiDAR / Camera data fusion to enable high-resolution 3D surface reconstruction for autonomous asteroid exploration mission
Research output: Contribution to book/Conference proceedings/Anthology/Report › Conference contribution › Contributed › peer-review
Abstract
For future science and exploration operations close to the surfaces of Small Solar System Bodies (SSSBs), a high-resolution 3D terrain model is needed for Guidance, Navigation, and Control (GNC) tasks. The resolution of the reconstructed SSSB terrain is mainly limited by the relatively low resolution of available flash-LiDAR devices. This paper proposes a deep learning-based fusion of low-resolution depth images and high-resolution monocular grayscale camera images to overcome this limitation and increase the resolution of the 3D terrain data acquired by flash-LiDAR. We use a Generative Adversarial Network (GAN)-based architecture to process the 3D terrain data of the irregular and unstructured surfaces of asteroids and comets. A synthetic dataset of 10,000 samples based on comet 67P/Churyumov-Gerasimenko was generated with high-fidelity rendering software for training and validation. Our method is suitable for wide-angle lens applications and is robust to varying illumination conditions; a resolution increase by a factor of 8×8 was achieved.
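The abstract describes fusing a low-resolution flash-LiDAR depth frame with a high-resolution grayscale camera frame to super-resolve the depth by 8×8 using a GAN generator. The sketch below is an illustrative PyTorch outline of such a fusion generator only; the class name `FusionGenerator`, the layer sizes, and the bicubic-upsample-plus-residual strategy are assumptions for this sketch and do not reflect the architecture actually used in the paper.

```python
# Minimal sketch of a generator for guided depth super-resolution,
# assuming a PyTorch-style implementation. Names, layer sizes, and the
# upsampling strategy are illustrative, not the authors' architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FusionGenerator(nn.Module):
    """Fuses a low-resolution depth image with a high-resolution
    grayscale guide image and predicts a depth map upsampled by 8x8."""

    def __init__(self, scale: int = 8, features: int = 64):
        super().__init__()
        self.scale = scale
        # Feature extractor for the high-resolution grayscale guide image.
        self.guide_net = nn.Sequential(
            nn.Conv2d(1, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Feature extractor for the naively upsampled depth image.
        self.depth_net = nn.Sequential(
            nn.Conv2d(1, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Fusion head: merges both feature stacks and regresses a residual.
        self.fusion = nn.Sequential(
            nn.Conv2d(2 * features, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, features, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(features, 1, 3, padding=1),
        )

    def forward(self, depth_lr: torch.Tensor, gray_hr: torch.Tensor) -> torch.Tensor:
        # Bicubic upsampling of the flash-LiDAR depth to camera resolution.
        depth_up = F.interpolate(depth_lr, scale_factor=self.scale,
                                 mode="bicubic", align_corners=False)
        fused = torch.cat([self.depth_net(depth_up), self.guide_net(gray_hr)], dim=1)
        # Add a learned residual correction on top of the naive upsampling.
        return depth_up + self.fusion(fused)


if __name__ == "__main__":
    g = FusionGenerator()
    depth_lr = torch.rand(1, 1, 32, 32)    # hypothetical flash-LiDAR frame
    gray_hr = torch.rand(1, 1, 256, 256)   # hypothetical camera frame
    print(g(depth_lr, gray_hr).shape)      # -> torch.Size([1, 1, 256, 256])
```

In a GAN setup such a generator would be trained against a discriminator that scores the realism of the super-resolved depth maps; that adversarial loop is omitted here for brevity.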
Details
| Original language | English |
|---|---|
| Title of host publication | 2023 AAS/AIAA Astrodynamics Specialist Conference |
| Publication status | Published - 2023 |
| Peer-reviewed | Yes |
External IDs
| ORCID | /0009-0004-0484-6297/work/196694813 |
|---|---|