Evetac: An Event-Based Optical Tactile Sensor for Robotic Manipulation

Research output: Contribution to journal › Research article › Contributed › Peer-reviewed

Abstract

Optical tactile sensors have recently become popular: they provide high spatial resolution but struggle to offer fine temporal resolution. To overcome this shortcoming, we study the idea of replacing the RGB camera with an event-based camera and introduce a new event-based optical tactile sensor called Evetac. Along with the hardware design, we develop touch processing algorithms that process its measurements online at 1000 Hz. We devise an efficient algorithm to track the elastomer's deformation through the imprinted markers despite the sensor's sparse output. Benchmarking experiments demonstrate Evetac's capabilities of sensing vibrations up to 498 Hz, reconstructing shear forces, and significantly reducing data rates compared to RGB optical tactile sensors. Moreover, Evetac's output and the marker tracking provide meaningful features for learning data-driven slip detection and prediction models. The learned models form the basis for a robust and adaptive closed-loop grasp controller capable of handling a wide range of objects. We believe that fast and efficient event-based tactile sensors like Evetac will be essential for bringing human-like manipulation capabilities to robotics.
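The abstract mentions tracking the imprinted markers from the event camera's sparse output. The paper's own algorithm is not reproduced here; the following is only a minimal, hypothetical sketch of the general idea of event-driven marker tracking: each incoming event is associated with the nearest tracked marker within a radius, and that marker's centroid is nudged toward the event location (the function name, radius, and smoothing factor are illustrative assumptions, not from the paper).

```python
import numpy as np

def update_markers(markers, events, radius=5.0, alpha=0.1):
    """Illustrative event-driven marker tracking (not the paper's method).

    markers: (M, 2) float array of current marker centroids (x, y) in pixels.
    events:  (N, 2) array of event pixel coordinates (x, y).

    Each event within `radius` pixels of its nearest marker pulls that
    marker's centroid toward the event via an exponential moving average.
    Returns the updated (M, 2) marker array.
    """
    markers = np.asarray(markers, dtype=float).copy()
    for ex, ey in events:
        # Distance from this event to every tracked marker.
        d = np.hypot(markers[:, 0] - ex, markers[:, 1] - ey)
        i = int(np.argmin(d))
        if d[i] <= radius:
            # Nudge the nearest marker toward the event location.
            markers[i] = (1.0 - alpha) * markers[i] + alpha * np.array([ex, ey])
    return markers
```

Because each event touches only one marker, a loop like this can keep up with sparse event streams at kilohertz update rates, which is the property the abstract highlights.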

Details

Original language: English
Pages (from-to): 3812-3832
Number of pages: 21
Journal: IEEE Transactions on Robotics
Volume: 40
Publication status: Published - 15 Jul 2024
Peer-reviewed: Yes

External IDs

ORCID /0000-0001-9430-8433/work/173989267

Keywords

  • Deep learning in robotics and automation
  • Event-based sensing
  • Force and tactile sensing
  • Perception for grasping and manipulation