Named Entity Recognition for Specific Domains - Take Advantage of Transfer Learning
Research output: Contribution to journal › Research article › Contributed › peer-review
Abstract
Automated text analysis such as named entity recognition (NER) relies heavily on large amounts of high-quality training data. For domain-specific NER, transfer learning approaches aim to overcome the lack of domain-specific training data. In this paper, we investigate transfer learning approaches to improve domain-specific NER in low-resource domains. The first part of the paper is dedicated to transferring information from known to unknown entities using BiLSTM-CRF neural networks, also considering the influence of varying training data sizes. In the second part, pre-trained BERT models are fine-tuned for domain-specific German NER. The performance of models of both architectures is compared with respect to different hyperparameters and a set of 16 entities. The experiments are based on the revised German SmartData Corpus and a baseline model trained on this corpus.
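In the BiLSTM-CRF architecture named in the abstract, the BiLSTM produces per-token emission scores for each tag, the CRF layer adds learned tag-transition scores, and inference selects the globally best tag sequence via Viterbi decoding. A minimal pure-Python sketch of that decoding step follows; the BIO tags and scores below are illustrative assumptions, not values from the paper:

```python
def viterbi_decode(emissions, transitions, tags):
    """Find the highest-scoring tag path.

    emissions: list of {tag: score} dicts, one per token (BiLSTM output).
    transitions: {(prev_tag, tag): score} (CRF transition parameters).
    """
    # Best path score ending in each tag at the first token.
    best = {t: emissions[0][t] for t in tags}
    backptr = []
    for emit in emissions[1:]:
        prev_best, best, ptr = best, {}, {}
        for t in tags:
            # Extend every previous path by tag t; keep the best predecessor.
            cand = {p: prev_best[p] + transitions[(p, t)] + emit[t] for p in tags}
            ptr[t] = max(cand, key=cand.get)
            best[t] = cand[ptr[t]]
        backptr.append(ptr)
    # Backtrack from the best final tag.
    last = max(best, key=best.get)
    path = [last]
    for ptr in reversed(backptr):
        path.append(ptr[path[-1]])
    return list(reversed(path)), best[last]


# Illustrative example: a location entity in "in Dresden Neustadt".
tags = ["O", "B-LOC", "I-LOC"]
transitions = {
    ("O", "O"): 0.0, ("O", "B-LOC"): 0.0, ("O", "I-LOC"): -10.0,  # forbid O -> I
    ("B-LOC", "O"): 0.0, ("B-LOC", "B-LOC"): -1.0, ("B-LOC", "I-LOC"): 1.0,
    ("I-LOC", "O"): 0.0, ("I-LOC", "B-LOC"): -1.0, ("I-LOC", "I-LOC"): 1.0,
}
emissions = [
    {"O": 3.0, "B-LOC": 0.0, "I-LOC": 0.0},   # "in"
    {"O": 0.0, "B-LOC": 2.0, "I-LOC": 1.5},   # "Dresden"
    {"O": 0.0, "B-LOC": 1.0, "I-LOC": 1.2},   # "Neustadt"
]
path, score = viterbi_decode(emissions, transitions, tags)
# path == ["O", "B-LOC", "I-LOC"]
```

The transition scores are what lets the CRF enforce BIO consistency (e.g. penalizing `O -> I-LOC`), which a per-token classifier alone cannot do.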
Details
| Original language | English |
|---|---|
| Pages (from-to) | 4-15 |
| Number of pages | 12 |
| Journal | International Journal of Information Science and Technology |
| Volume | 6 |
| Issue number | 3 |
| Publication status | Published - 2022 |
| Peer-reviewed | Yes |
External IDs
| ORCID | /0000-0001-9756-6390/work/142250121 |
|---|---|
Keywords
- Automated text analysis
- NER