If a therapy bot walks like a duck and talks like a duck then it is a medically regulated duck

Research output: Contribution to journal › Comment/Debate › Contributed › peer-review

Abstract

Large language models (LLMs) are increasingly used for mental health interactions, often mimicking therapeutic behaviour without regulatory oversight. Documented harms, including suicides, highlight the urgent need for stronger safeguards. This manuscript argues that LLMs providing therapy-like functions should be regulated as medical devices, with standards ensuring safety, transparency and accountability. Pragmatic regulation is essential to protect vulnerable users and maintain the credibility of digital health interventions.

Details

Original language: English
Article number: 741
Journal: npj Digital Medicine
Volume: 8
Issue number: 1
Publication status: Published - 5 Dec 2025
Peer-reviewed: Yes

External IDs

ORCID /0000-0002-3730-5348/work/199963873
ORCID /0000-0002-1997-1689/work/199963891
ORCID /0000-0003-3323-2492/work/199963912
ORCID /0009-0004-7808-2701/work/199964112
Scopus 105024135832
