Mr-Fosdick at SemEval-2023 Task 5: Comparing Dataset Expansion Techniques for Non-Transformer and Transformer Models: Improving Model Performance through Data Augmentation
Abstract
In supervised learning, a significant amount of training data is essential. To address this need, we generated and evaluated additional datasets derived from a provided dataset using transformer and non-transformer models. By incorporating these generated datasets into the training of new models, we attain a higher balanced accuracy during validation than when using only the original dataset.
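The abstract describes the approach only at a high level. The sketch below illustrates the general evaluation pattern it implies: train the same model once on the original data and once on the original data plus generated examples, then compare balanced accuracy on a shared validation split. The TF-IDF + logistic regression baseline and the placeholder texts are illustrative assumptions, not the models or data used in the paper.

```python
# Minimal sketch, assuming a simple non-transformer baseline: compare balanced
# accuracy with and without generated training data on the same validation split.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import balanced_accuracy_score
from sklearn.pipeline import make_pipeline

# Hypothetical placeholder data; in the paper these would be the provided
# shared-task examples and the synthetically generated additions.
original_texts = ["headline one", "headline two", "headline three", "headline four"]
original_labels = [0, 1, 0, 1]
generated_texts = ["paraphrased headline one", "paraphrased headline two"]
generated_labels = [0, 1]
val_texts = ["held-out headline", "another held-out headline"]
val_labels = [0, 1]

def train_and_score(train_texts, train_labels):
    """Fit a TF-IDF + logistic regression baseline and report balanced accuracy."""
    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(train_texts, train_labels)
    return balanced_accuracy_score(val_labels, model.predict(val_texts))

score_original = train_and_score(original_texts, original_labels)
score_augmented = train_and_score(
    original_texts + generated_texts, original_labels + generated_labels
)
print(f"balanced accuracy (original only):        {score_original:.3f}")
print(f"balanced accuracy (original + generated): {score_augmented:.3f}")
```

The same comparison applies regardless of the classifier; the augmented run differs only in the training set passed to the fitting step.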
Details
Original language | English |
---|---|
Title of host publication | Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023) |
Editors | Atul Kr. Ojha, A. Seza Doğruöz, Giovanni Da San Martino, Harish Tayyar Madabushi, Ritesh Kumar, Elisa Sartori |
Publisher | Association for Computational Linguistics (ACL) |
Pages | 88-93 |
Number of pages | 6 |
ISBN (electronic) | 9781959429999 |
Publication status | Published - 2023 |
Peer-reviewed | Yes |
Conference
Title | 17th International Workshop on Semantic Evaluation, SemEval 2023, co-located with the 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023 |
---|---|
Duration | 13 - 14 July 2023 |
City | Toronto (hybrid) |
Country | Canada |