Evetac: An Event-Based Optical Tactile Sensor for Robotic Manipulation

Publication: Contribution to journal › Research article › Contributed › Peer-reviewed


Abstract

Optical tactile sensors have recently become popular. They provide high spatial resolution, but struggle to offer fine temporal resolutions. To overcome this shortcoming, we study the idea of replacing the RGB camera with an event-based camera and introduce a new event-based optical tactile sensor called Evetac. Along with hardware design, we develop touch processing algorithms to process its measurements online at 1000 Hz. We devise an efficient algorithm to track the elastomer's deformation through the imprinted markers despite the sensor's sparse output. Benchmarking experiments demonstrate Evetac's capabilities of sensing vibrations up to 498 Hz, reconstructing shear forces, and significantly reducing data rates compared to RGB optical tactile sensors. Moreover, Evetac's output and the marker tracking provide meaningful features for learning data-driven slip detection and prediction models. The learned models form the basis for a robust and adaptive closed-loop grasp controller capable of handling a wide range of objects. We believe that fast and efficient event-based tactile sensors like Evetac will be essential for bringing human-like manipulation capabilities to robotics.
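
The marker-tracking step described in the abstract can be pictured with a small sketch. The following is not the paper's algorithm, only a hedged illustration of the general idea of tracking imprinted gel markers from a sparse event stream: events from one accumulation window (e.g., 1 ms, matching the 1000 Hz update rate) pull the nearest marker estimate toward them. The class name EventMarkerTracker and all parameter values are hypothetical.

    import numpy as np

    # Hypothetical sketch, not the authors' method: track marker centers by
    # assigning each event to its nearest marker and nudging that marker
    # toward the event's pixel location.
    class EventMarkerTracker:
        def __init__(self, init_positions, gain=0.05, radius=6.0):
            # init_positions: (M, 2) array of initial marker centers (px)
            self.pos = np.asarray(init_positions, dtype=float)
            self.gain = gain      # step size of the pull toward each event
            self.radius = radius  # ignore events farther than this (px)

        def update(self, events):
            # events: (N, 2) pixel coordinates (x, y) from one accumulation
            # window (e.g., 1 ms for a 1000 Hz update rate)
            for xy in np.asarray(events, dtype=float):
                d = np.linalg.norm(self.pos - xy, axis=1)
                i = int(np.argmin(d))
                if d[i] <= self.radius:
                    # exponential-moving-average style update
                    self.pos[i] += self.gain * (xy - self.pos[i])
            return self.pos

    # Usage: a 2x2 marker grid with synthetic events around a shifted marker.
    tracker = EventMarkerTracker([[10, 10], [10, 30], [30, 10], [30, 30]])
    rng = np.random.default_rng(0)
    events = rng.normal(loc=[12, 10], scale=1.0, size=(50, 2))
    print(tracker.update(events))  # first marker drifts toward (12, 10)

A nearest-neighbor assignment of this kind has constant cost per event, which hints at why a sparse event stream can be processed online at kilohertz rates; the paper's actual algorithm should be consulted for the real design.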

Details

Original language: English
Pages (from-to): 3812-3832
Number of pages: 21
Journal: IEEE Transactions on Robotics
Volume: 40
Publication status: Published - 2024
Peer-review status: Yes

External IDs

ORCID /0000-0001-9430-8433/work/173989267

Keywords

  • Deep learning in robotics and automation
  • Event-based sensing
  • Force and tactile sensing
  • Perception for grasping and manipulation