Named Entity Recognition for Specific Domains - Take Advantage of Transfer Learning

Publication: Contribution to journal › Research article › Contributed › Peer-reviewed

Abstract

Automated text analysis such as named entity recognition (NER) heavily relies on large amounts of high-quality training data. For domain-specific NER, transfer learning approaches aim to overcome the lack of domain-specific training data. In this paper, we investigate transfer learning approaches to improve domain-specific NER in low-resource domains. The first part of the paper is dedicated to information transfer from known to unknown entities using BiLSTM-CRF neural networks, also considering the influence of varying training data size. In the second part, pre-trained BERT models are fine-tuned for domain-specific German NER. The performance of models of both architectures is compared with respect to different hyperparameters and a set of 16 entities. The experiments are based on the revised German SmartData Corpus and a baseline model trained on this corpus.
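
The BERT-based approach mentioned in the abstract follows the standard transfer-learning recipe of placing a token-classification head on a pre-trained German encoder and fine-tuning it end-to-end on the annotated corpus. Below is a minimal sketch of that general recipe, assuming the Hugging Face transformers library, the bert-base-german-cased checkpoint, a BIO tag set over 16 entity types, and an invented example sentence; none of these specifics are taken from the paper.

```python
# Minimal sketch of fine-tuning a pre-trained German BERT for token-level NER.
# Checkpoint name, label count, and example sentence are illustrative assumptions,
# not details from the paper.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "bert-base-german-cased"   # assumed German checkpoint
NUM_LABELS = 2 * 16 + 1                 # B-/I- tags for 16 entity types plus "O"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME, num_labels=NUM_LABELS)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# Invented German sentence with placeholder ("O") labels, just to show the training signal.
sentence = "Der ICE 1234 fährt um 14:30 Uhr von Dresden nach Berlin."
encoding = tokenizer(sentence, return_tensors="pt")
labels = torch.zeros_like(encoding["input_ids"])  # label ids per sub-word token, all "O" here

outputs = model(**encoding, labels=labels)  # cross-entropy loss over sub-word tokens
outputs.loss.backward()                     # gradients also update the pre-trained encoder
optimizer.step()                            # one fine-tuning step
```

In practice this step would be repeated over the training split of the corpus, with hyperparameters such as learning rate and training data size varied as the abstract describes.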

Details

Original language: English
Pages (from - to): 4-15
Number of pages: 12
Journal: International Journal of Information Science and Technology
Volume: 6
Issue number: 3
Publication status: Published - 2022
Peer-review status: Yes

External IDs

ORCID /0000-0001-9756-6390/work/142250121

Keywords

  • Automated text analysis, NER
