Named Entity Recognition for Specific Domains - Take Advantage of Transfer Learning

Research output: Contribution to journal, research article, contributed, peer-reviewed

Abstract

Automated text analysis such as named entity recognition (NER) heavily relies on large amounts of high-quality training data. For domain-specific NER, transfer learning approaches aim to overcome the lack of domain-specific training data. In this paper, we investigate transfer learning approaches to improve domain-specific NER in low-resource domains. The first part of the paper is dedicated to information transfer from known to unknown entities using BiLSTM-CRF neural networks, also considering the influence of varying training data size. In the second part, pre-trained BERT models are fine-tuned for domain-specific German NER. The performance of models of both architectures is compared with respect to different hyperparameters and a set of 16 entity types. The experiments are based on the revised German SmartData Corpus and a baseline model trained on this corpus.
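
The BERT fine-tuning described in the second part of the abstract can be illustrated with a minimal sketch using the Hugging Face Transformers library. This is not the authors' configuration: the model name (bert-base-german-cased), the reduced label set, the toy sentence, and the training settings are illustrative assumptions; the paper's 16 entity types and the revised German SmartData Corpus are not reproduced here.

```python
# Minimal sketch: fine-tuning a pre-trained German BERT model for token
# classification (NER). Model name, label set, and hyperparameters are
# illustrative assumptions, not the configuration used in the paper.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Hypothetical reduced label set; the paper evaluates 16 entity types.
labels = ["O", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]
id2label = {i: l for i, l in enumerate(labels)}
label2id = {l: i for i, l in id2label.items()}

tokenizer = AutoTokenizer.from_pretrained("bert-base-german-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-german-cased",
    num_labels=len(labels),
    id2label=id2label,
    label2id=label2id,
)

# One toy training step on a single pre-tokenized German sentence.
words = ["Die", "TU", "Dresden", "liegt", "in", "Sachsen", "."]
word_labels = ["O", "B-ORG", "I-ORG", "O", "O", "B-LOC", "O"]

enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Align word-level labels to sub-word tokens; special tokens get -100
# so they are ignored by the loss.
aligned = [
    -100 if wid is None else label2id[word_labels[wid]]
    for wid in enc.word_ids(batch_index=0)
]
enc["labels"] = torch.tensor([aligned])

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
optimizer.zero_grad()
loss = model(**enc).loss  # cross-entropy over token labels
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.3f}")
```

In practice, such a fine-tuning loop would iterate over batches of a domain-specific corpus and evaluate entity-level precision, recall, and F1 on held-out data, which is the kind of comparison the abstract refers to.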

Details

Original language: English
Pages (from-to): 4-15
Number of pages: 12
Journal: International Journal of Information Science and Technology
Volume: 6
Issue number: 3
Publication status: Published - 2022
Peer-reviewed: Yes

External IDs

ORCID /0000-0001-9756-6390/work/142250121

Keywords

  • Automated text analysis
  • NER