Towards Privacy and Utility in Tourette TIC Detection Through Pretraining Based on Publicly Available Video Data of Healthy Subjects
Publication: Contribution to a journal › Conference article › Contributed › Peer-reviewed
Abstract
Data privacy is typically particularly difficult to achieve in medical applications of machine learning, despite its importance in this area. The datasets are often small, so machine learning models such as neural networks tend to memorize information about the training data, which allows confidential and sensitive information about patients to be extracted from the model. A further challenge is that privacy must be achieved while ensuring the best possible utility. In this work, we aim to detect tics in video data of patients with Gilles de la Tourette syndrome. Facial landmarks serve as a lower-dimensional representation of the video data. Through membership inference attacks, we show that training a simple neural network directly on the sensitive training data leaks information about that data, and that this leakage can be prevented by suitable pretraining on a large amount of unlabeled public data of healthy subjects. The proposed approach not only reduces the attack accuracy to 52.15 %, but also achieves a high tic detection accuracy of 86.53 %. © 2023 IEEE.
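The membership inference attacks used to quantify leakage can be illustrated with a minimal loss-threshold variant: an attacker predicts "member" whenever the model's loss on an example is below a threshold, since memorizing models assign unusually low loss to their training examples. This is a hedged sketch, not the paper's attack; the function name, threshold, and loss values are illustrative.

```python
import numpy as np

def loss_threshold_mia(member_losses, nonmember_losses, threshold):
    """Toy loss-threshold membership inference attack.

    Predict 'member' when the model's loss on an example falls below
    `threshold`, and return the attack's balanced accuracy over known
    member and non-member examples.
    """
    member_losses = np.asarray(member_losses, dtype=float)
    nonmember_losses = np.asarray(nonmember_losses, dtype=float)
    true_positive_rate = np.mean(member_losses < threshold)      # members flagged
    true_negative_rate = np.mean(nonmember_losses >= threshold)  # non-members cleared
    return 0.5 * (true_positive_rate + true_negative_rate)

# Illustrative losses: a memorizing model shows much lower loss on its
# training (member) examples than on unseen (non-member) examples.
members = [0.05, 0.10, 0.08, 0.12]
nonmembers = [0.90, 0.70, 1.10, 0.60]
print(loss_threshold_mia(members, nonmembers, threshold=0.5))  # → 1.0
```

An attack accuracy near 0.5, as reported for the pretrained model (52.15 %), means the attacker does no better than random guessing, i.e. little membership information leaks.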
Details
Original language | English
---|---
Journal | International Conference on Acoustics, Speech, and Signal Processing (ICASSP)
Publication status | Published - 2023
Peer-review status | Yes
Conference
Title | 48th IEEE International Conference on Acoustics, Speech and Signal Processing
---|---
Subtitle | Signal Processing in the AI era
Short title | ICASSP 2023
Event number | 48
Duration | 4 - 10 June 2023
Venue | Rodos Palace Luxury Convention Resort
City | Rhodes Island
Country | Greece
External IDs

ORCID | /0000-0002-2989-9561/work/151981738
---|---
Keywords
- contrastive learning, membership inference, pretraining, privacy-preserving, representation learning