If a therapy bot walks like a duck and talks like a duck then it is a medically regulated duck
Publication: Contribution to journal › Comment / letters to the editor without own data › Contributed › Peer-reviewed
Abstract
Large language models (LLMs) are increasingly used for mental health interactions, often mimicking therapeutic behaviour without regulatory oversight. Documented harms, including suicides, highlight the urgent need for stronger safeguards. This manuscript argues that LLMs providing therapy-like functions should be regulated as medical devices, with standards ensuring safety, transparency and accountability. Pragmatic regulation is essential to protect vulnerable users and maintain the credibility of digital health interventions.
Details
| Original language | English |
|---|---|
| Article number | 741 |
| Journal | npj Digital Medicine |
| Volume | 8 |
| Issue number | 1 |
| Publication status | Published - 5 Dec 2025 |
| Peer-review status | Yes |
External IDs
| ORCID | /0000-0002-3730-5348/work/199963873 |
|---|---|
| ORCID | /0000-0002-1997-1689/work/199963891 |
| ORCID | /0000-0003-3323-2492/work/199963912 |
| ORCID | /0009-0004-7808-2701/work/199964112 |
| Scopus | 105024135832 |