General In-Hand Object Rotation with Vision and Touch

Research output: Contribution to journal › Conference article › Contributed › peer-review

Abstract

We introduce RotateIt, a system that enables fingertip-based object rotation along multiple axes by leveraging multimodal sensory inputs. Our system is trained in simulation, where it has access to ground-truth object shapes and physical properties. We then distill it to operate on realistic yet noisy simulated visuotactile and proprioceptive sensory inputs. These multimodal inputs are fused via a visuotactile transformer, enabling online inference of object shapes and physical properties during deployment. We show significant performance improvements over prior methods and demonstrate the importance of both visual and tactile sensing.
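The abstract describes fusing vision, touch, and proprioception with a transformer so the policy can infer object properties online. As a rough illustration of that fusion step, the sketch below treats each modality as one token and mixes them with single-head self-attention. All names, sizes, and random weights here are illustrative assumptions; this is not the paper's actual visuotactile transformer.

```python
# Minimal numpy sketch of cross-modal fusion with single-head self-attention.
# Each modality (vision, touch, proprioception) contributes one token; the
# attention step lets every token attend to the others before pooling.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_modalities(tokens, rng):
    """tokens: (n_tokens, d) array, one embedding per modality."""
    n, d = tokens.shape
    # Random projections stand in for learned query/key/value weights.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d))   # cross-modal attention weights
    return (attn @ V).mean(axis=0)         # pooled fused feature for a policy

rng = np.random.default_rng(0)
vision, touch, proprio = rng.standard_normal((3, 32))  # toy embeddings
fused = fuse_modalities(np.stack([vision, touch, proprio]), rng)
print(fused.shape)  # (32,)
```

In the full system such a fused feature would feed the distilled policy; here the pooled vector simply demonstrates that the output dimensionality matches the shared token space.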

Details

Original language: English
Journal: Proceedings of Machine Learning Research
Volume: 229
Publication status: Published - 2023
Peer-reviewed: Yes

Conference

Title: 7th Conference on Robot Learning, CoRL 2023
Duration: 6-9 November 2023
City: Atlanta
Country: United States of America

External IDs

ORCID /0000-0001-9430-8433/work/158768044

Keywords

  • In-Hand Object Rotation, Reinforcement Learning, Sim-to-Real, Tactile Sensing, Transformer, Visuotactile Manipulation