A surgical activity model of laparoscopic cholecystectomy for co-operation with collaborative robots

Publication: Contribution to journal › Research article › Contributed › Peer-reviewed


Abstract

Background: Laparoscopic cholecystectomy is a very frequent surgical procedure. However, in an ageing society, fewer surgical staff will need to perform surgery on patients. Collaborative surgical robots (cobots) could address surgical staff shortages and workload. To achieve context-awareness for surgeon-robot collaboration, recognition of the intraoperative action workflow is a key challenge.

Methods: A surgical process model was developed for intraoperative surgical activities, each comprising actor, instrument, action and target, in laparoscopic cholecystectomy (excluding camera guidance). These activities, as well as instrument presence and surgical phases, were annotated in videos of laparoscopic cholecystectomy performed on human patients (n = 10) and on explanted porcine livers (n = 10). The machine learning algorithm Distilled-Swin was trained on our own annotated dataset and the CholecT45 dataset. The model was validated using a fivefold cross-validation approach.

Results: In total, 22,351 activities were annotated, covering a cumulative duration of 24.9 h of video segments. The machine learning algorithm trained and validated on our own dataset scored a mean average precision (mAP) of 25.7% and a top-K (K = 5) accuracy of 85.3%. With training and validation on our dataset and CholecT45, the algorithm scored a mAP of 37.9%.

Conclusions: An activity model was developed and applied for the fine-granular annotation of laparoscopic cholecystectomies in two surgical settings. A machine recognition algorithm trained on our own annotated dataset and CholecT45 achieved higher performance than training only on CholecT45 and can recognize frequently occurring activities well, but not infrequent ones. The analysis of the annotated dataset allowed quantification of the potential of collaborative surgical robots to address the workload of surgical staff. If collaborative surgical robots could grasp and hold tissue, up to 83.5% of the assistant's tissue-interacting tasks (i.e. excluding camera guidance) could be performed by robots.
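The abstract reports mean average precision (mAP) and top-K (K = 5) accuracy as evaluation metrics for multi-label activity recognition. The following Python sketch is purely illustrative and is not the authors' implementation; the array shapes, the per-frame binary label encoding, and the use of scikit-learn's average_precision_score are assumptions made for this example.

# Illustrative sketch only: mAP and top-K accuracy for per-frame,
# multi-label surgical activity predictions (not the authors' code).
import numpy as np
from sklearn.metrics import average_precision_score

def mean_average_precision(y_true, y_score):
    """Class-wise average precision, averaged over classes present in the data.

    y_true:  (n_frames, n_classes) binary ground-truth activity labels
    y_score: (n_frames, n_classes) predicted activity probabilities
    """
    aps = []
    for c in range(y_true.shape[1]):
        if y_true[:, c].any():  # skip classes absent from this validation fold
            aps.append(average_precision_score(y_true[:, c], y_score[:, c]))
    return float(np.mean(aps))

def top_k_accuracy(y_true, y_score, k=5):
    """Fraction of frames whose k highest-scoring classes contain a true activity.

    Frames without any ground-truth activity are counted as misses here
    (a simplification for this sketch).
    """
    top_k = np.argsort(y_score, axis=1)[:, -k:]  # indices of the k largest scores
    hits = [y_true[i, top_k[i]].any() for i in range(y_true.shape[0])]
    return float(np.mean(hits))

# Toy usage example: 4 frames, 6 hypothetical activity classes
rng = np.random.default_rng(0)
y_true = (rng.random((4, 6)) > 0.7).astype(int)
y_score = rng.random((4, 6))
print(f"mAP: {mean_average_precision(y_true, y_score):.3f}")
print(f"top-5 accuracy: {top_k_accuracy(y_true, y_score, k=5):.3f}")

In a fivefold cross-validation setup such as the one described above, these metrics would be computed per fold and then averaged across folds.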

Details

Original language: English
Pages (from - to): 4316-4328
Number of pages: 13
Journal: Surgical Endoscopy
Volume: 38 (2024)
Issue number: 8
Publication status: Published - 13 June 2024
Peer-review status: Yes

External IDs

Mendeley d6b1e71e-4eb1-3737-9edb-5452bef872d6
ORCID /0000-0002-4590-1908/work/163294152
PubMed 38872018

Keywords

  • Action recognition, Collaborative surgical robots, Machine learning, Robot autonomy, Surgical data science, Surgical process modeling