Mr-Fosdick at SemEval-2023 Task 5: Comparing Dataset Expansion Techniques for Non-Transformer and Transformer Models: Improving Model Performance through Data Augmentation

Publication: Contribution to conference proceedings (contributed, peer-reviewed)

Contributors

Abstract

In supervised learning, large amounts of data are essential. To obtain them, we generated and evaluated datasets based on the provided dataset, using both transformer and non-transformer models. Training new models on these generated datasets yields a higher balanced accuracy during validation than training on the original dataset alone.
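The abstract names the two ingredients of the approach: expanding a dataset with generated examples and comparing models by balanced accuracy on validation data. The sketch below is illustrative only, since the record does not detail the paper's actual augmentation methods: the tiny synonym table stands in for a real non-transformer augmentation resource, and balanced accuracy is implemented directly as the mean of per-class recalls.

```python
import random

# Hypothetical mini synonym table -- a stand-in for a real augmentation
# resource; the paper's actual techniques are not described in this record.
SYNONYMS = {
    "significant": ["substantial", "considerable"],
    "essential": ["crucial", "vital"],
}

def augment(text: str, rng: random.Random) -> str:
    """Create a new training example by swapping known words for synonyms."""
    words = text.split()
    return " ".join(rng.choice(SYNONYMS[w]) if w in SYNONYMS else w
                    for w in words)

def balanced_accuracy(y_true, y_pred) -> float:
    """Mean per-class recall -- the validation metric named in the abstract."""
    recalls = []
    for c in set(y_true):
        idx = [i for i, y in enumerate(y_true) if y == c]
        hits = sum(1 for i in idx if y_pred[i] == c)
        recalls.append(hits / len(idx))
    return sum(recalls) / len(recalls)

rng = random.Random(0)
print(augment("a significant amount of data is essential", rng))
print(balanced_accuracy([0, 0, 1, 1], [0, 0, 1, 0]))  # 0.75
```

Balanced accuracy, unlike plain accuracy, weights every class equally, which matters when the generated examples change the class distribution of the expanded dataset.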

Details

Original language: English
Title: Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023)
Editors: Atul Kr. Ojha, A. Seza Dogruoz, Giovanni Da San Martino, Harish Tayyar Madabushi, Ritesh Kumar, Elisa Sartori
Publisher: Association for Computational Linguistics (ACL)
Pages: 88-93
Number of pages: 6
ISBN (electronic): 9781959429999
Publication status: Published - 2023
Peer-reviewed: Yes

Conference

Title: 17th International Workshop on Semantic Evaluation, SemEval 2023, co-located with the 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023
Duration: 13-14 July 2023
City: Hybrid, Toronto
Country: Canada