General In-Hand Object Rotation with Vision and Touch
Research output: Contribution to journal › Conference article › Contributed › peer-review
Abstract
We introduce RotateIt, a system that enables fingertip-based object rotation along multiple axes by leveraging multimodal sensory inputs. Our system is trained in simulation, where it has access to ground-truth object shapes and physical properties. We then distill it to operate on realistic yet noisy simulated visuotactile and proprioceptive sensory inputs. These multimodal inputs are fused via a visuotactile transformer, enabling online inference of object shapes and physical properties during deployment. We show significant performance improvements over prior methods and demonstrate the importance of both visual and tactile sensing.
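The fusion step described above — per-modality features projected into a shared token space and combined by attention before reaching the policy — can be illustrated with a minimal single-head self-attention sketch. This is not the authors' architecture; all dimensions, encoder weights, and input shapes below are hypothetical placeholders chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over modality tokens.

    tokens: (T, d) array, one row per sensory modality.
    """
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scale = np.sqrt(q.shape[-1])
    attn = softmax(q @ k.T / scale)  # (T, T) cross-modality weights
    return attn @ v

d = 16  # hypothetical shared embedding width
# Hypothetical linear encoders projecting each raw modality to a d-dim token.
W_vis = rng.normal(scale=0.1, size=(32, d))   # e.g. flattened visual features
W_tac = rng.normal(scale=0.1, size=(15, d))   # e.g. per-sensor contact signals
W_prop = rng.normal(scale=0.1, size=(16, d))  # e.g. joint positions

vision = rng.normal(size=32)
touch = rng.integers(0, 2, size=15).astype(float)
proprio = rng.normal(size=16)

# One token per modality, fused by attention, then pooled for the policy head.
tokens = np.stack([vision @ W_vis, touch @ W_tac, proprio @ W_prop])  # (3, d)
Wq, Wk, Wv = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))
fused = self_attention(tokens, Wq, Wk, Wv).mean(axis=0)  # (d,) fused feature
print(fused.shape)
```

In a full system each encoder would be a learned network and the attention block would be a multi-layer transformer; the sketch only shows how attention lets the three modalities exchange information before a pooled feature is passed downstream.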
Details
| Original language | English |
| --- | --- |
| Journal | Proceedings of Machine Learning Research |
| Volume | 229 |
| Publication status | Published - 2023 |
| Peer-reviewed | Yes |
Conference
| Title | 7th Conference on Robot Learning |
| --- | --- |
| Abbreviated title | CoRL 2023 |
| Conference number | 7 |
| Duration | 6 - 9 November 2023 |
| Degree of recognition | International event |
| Location | Starling Hotel |
| City | Atlanta |
| Country | United States of America |
External IDs
| ORCID | /0000-0001-9430-8433/work/158768044 |
| --- | --- |
Keywords
- In-Hand Object Rotation
- Reinforcement Learning
- Sim-to-Real
- Tactile Sensing
- Transformer
- Visuotactile Manipulation