If a therapy bot walks like a duck and talks like a duck then it is a medically regulated duck

Publication: Contribution to journal › Comment / letter to the editor without own data › Contributed › Peer-reviewed

Abstract

Large language models (LLMs) are increasingly used for mental health interactions, often mimicking therapeutic behaviour without regulatory oversight. Documented harms, including suicides, highlight the urgent need for stronger safeguards. This manuscript argues that LLMs providing therapy-like functions should be regulated as medical devices, with standards ensuring safety, transparency and accountability. Pragmatic regulation is essential to protect vulnerable users and maintain the credibility of digital health interventions.

Details

Original language: English
Article number: 741
Journal: npj digital medicine
Volume: 8
Issue number: 1
Publication status: Published - 5 Dec 2025
Peer-reviewed: Yes

External IDs

ORCID /0000-0002-3730-5348/work/199963873
ORCID /0000-0002-1997-1689/work/199963891
ORCID /0000-0003-3323-2492/work/199963912
ORCID /0009-0004-7808-2701/work/199964112
Scopus 105024135832

Keywords

Sustainable Development Goals