CNNs improve decoding of selective attention to speech in cochlear implant users

Publication: Contribution to journal › Research article › Contributed › Peer-reviewed

Contributors

Abstract

Objective. Understanding speech in the presence of background noise, such as competing speech streams, is difficult for people with hearing impairment, and in particular for users of cochlear implants (CIs). To improve their listening experience, auditory attention decoding (AAD) aims to decode a listener's target speaker from electroencephalography (EEG) and then use this information to steer an auditory prosthesis towards that speech signal. In normal-hearing individuals, deep neural networks (DNNs) have been shown to improve AAD over simpler linear models. We aim to demonstrate that DNNs can improve attention decoding in CI users as well, which would make them the state-of-the-art candidate for a neuro-steered CI.

Approach. To this end, we first collected an EEG dataset on selective auditory attention from 25 bilateral CI users, and then implemented both a linear model and a convolutional neural network (CNN) for attention decoding. Moreover, we introduced a novel, objective CI-artifact removal strategy and evaluated its impact on decoding accuracy, alongside learnable speaker classification using a support vector machine (SVM).

Main results. The CNN outperformed the linear model across all decision window sizes from 1 to 60 s. Removing CI artifacts modestly improved the CNN's decoding accuracy. With SVM classification, the CNN decoder reached a peak mean decoding accuracy of 74% at the population level for a 60 s decision window.

Significance. These results demonstrate the superior potential of CNN-based decoding for neuro-steered CIs, which could significantly improve the speech perception of their users in cocktail-party situations.
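To illustrate the decision-window AAD paradigm the abstract compares against, below is a minimal, hypothetical sketch of the standard linear baseline (a backward/stimulus-reconstruction model): a ridge-regression decoder reconstructs the speech envelope from EEG, and within each decision window the attended speaker is taken to be the one whose envelope correlates best with the reconstruction. This is not the authors' implementation; the synthetic data, channel count, window length, and regularization value are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 64                          # Hz, typical envelope sampling rate (assumed)
win = 60 * fs                    # one 60 s decision window, as in the paper's largest window
n_ch, n_train, n_test = 8, 10, 10

# Fixed synthetic mixing from the attended envelope into EEG channels.
W = rng.standard_normal(n_ch)

def trial():
    """Simulate one two-speaker trial: EEG weakly tracks the attended envelope."""
    env_att = np.abs(rng.standard_normal(win))   # attended speech envelope
    env_ign = np.abs(rng.standard_normal(win))   # ignored speech envelope
    eeg = np.outer(W, env_att) + 0.5 * rng.standard_normal((n_ch, win))
    return eeg, env_att, env_ign

# Train: ridge regression mapping EEG channels to the attended envelope.
X, y = [], []
for _ in range(n_train):
    eeg, env_att, _ = trial()
    X.append(eeg.T)
    y.append(env_att)
X = np.vstack(X)
y = np.concatenate(y)
lam = 1e-2                       # ridge regularization (illustrative)
d = np.linalg.solve(X.T @ X + lam * np.eye(n_ch), X.T @ y)

# Test: per decision window, pick the speaker whose envelope best
# matches the reconstructed envelope.
correct = 0
for _ in range(n_test):
    eeg, env_att, env_ign = trial()
    rec = d @ eeg
    r_att = np.corrcoef(rec, env_att)[0, 1]
    r_ign = np.corrcoef(rec, env_ign)[0, 1]
    correct += int(r_att > r_ign)
acc = correct / n_test
print(f"decoding accuracy on synthetic data: {acc:.2f}")
```

The CNN and SVM stages described in the abstract replace this fixed correlation rule with learned classifiers, which is what enables the reported gains at short decision windows.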

Details

Original language: English
Article number: 036034
Number of pages: 17
Journal: Journal of Neural Engineering
Volume: 22
Issue number: 3
Publication status: Published - 10 June 2025
Peer-review status: Yes

External IDs

Scopus 105008083512
ORCID /0000-0002-8487-9977/work/186621112
ORCID /0000-0002-5009-1719/work/186621385
ORCID /0009-0009-6314-7126/work/186621392

Keywords

  • cochlear implant (CI), convolutional neural network (CNN), brain-computer interface (BCI), EEG, auditory attention decoding (AAD)