General In-Hand Object Rotation with Vision and Touch

Publication: Contribution to journal › Conference article › Contributed › Peer-reviewed

Contributors

Abstract

We introduce RotateIt, a system that enables fingertip-based object rotation along multiple axes by leveraging multimodal sensory inputs. Our system is trained in simulation, where it has access to ground-truth object shapes and physical properties. Then we distill it to operate on realistic yet noisy simulated visuotactile and proprioceptive sensory inputs. These multimodal inputs are fused via a visuotactile transformer, enabling online inference of object shapes and physical properties during deployment. We show significant performance improvements over prior methods and the importance of visual and tactile sensing.
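The abstract's key architectural idea is fusing visual, tactile, and proprioceptive features with a transformer-style attention layer. The sketch below is a minimal single-head self-attention fusion in NumPy, not the authors' actual architecture: the embedding width, token counts per modality, random weights, and mean-pooling readout are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 32  # shared embedding width (assumed, not from the paper)

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_fuse(tokens, Wq, Wk, Wv):
    """Single-head self-attention over a (T, D) token sequence."""
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    A = softmax(Q @ K.T / np.sqrt(D), axis=-1)  # (T, T) attention weights
    return A @ V                                # (T, D) fused tokens

# Hypothetical per-modality embeddings, already projected to width D:
visual_tokens  = rng.normal(size=(8, D))   # e.g. visual features of the object
tactile_tokens = rng.normal(size=(4, D))   # e.g. one token per fingertip sensor
proprio_token  = rng.normal(size=(1, D))   # joint-state summary

# Concatenate all modalities into one token sequence and attend across it,
# letting each modality's tokens condition on the others:
tokens = np.concatenate([visual_tokens, tactile_tokens, proprio_token], axis=0)

# Random stand-ins for learned projection matrices:
Wq, Wk, Wv = (rng.normal(size=(D, D)) * D**-0.5 for _ in range(3))
fused = attention_fuse(tokens, Wq, Wk, Wv)

# Mean-pool into a single feature vector for a downstream policy:
z = fused.mean(axis=0)
print(z.shape)  # (32,)
```

In the paper's setting this fused feature would feed the rotation policy, letting it infer object shape and physical properties online; here the readout is a plain mean-pool for brevity.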

Details

Original language: English
Journal: Proceedings of Machine Learning Research
Volume: 229
Publication status: Published - 2023
Peer-reviewed: Yes

Conference

Title: 7th Conference on Robot Learning, CoRL 2023
Dates: 6 - 9 November 2023
City: Atlanta
Country: United States

External IDs

ORCID /0000-0001-9430-8433/work/158768044

Keywords

  • In-Hand Object Rotation, Reinforcement Learning, Sim-to-Real, Tactile Sensing, Transformer, Visuotactile Manipulation