
Clozer: Adaptable Data Augmentation for Cloze-style Reading Comprehension

Holy Lovenia, Bryan Wilie, Willy Chung, Zeng Min, Samuel Cahyawijaya, Dan Su, Pascale Fung


Abstract
Task-adaptive pre-training (TAPT) alleviates the lack of labelled data and provides a performance lift by adapting unlabelled data to the downstream task. Unfortunately, existing adaptations mainly rely on deterministic rules that cannot generalize well. Here, we propose Clozer, a sequence-tagging-based cloze answer extraction method used in TAPT that is extendable to any cloze-style machine reading comprehension (MRC) downstream task. We experiment on multiple-choice cloze-style MRC tasks, and show that Clozer performs significantly better than the oracle and the state of the art in escalating the effectiveness of TAPT at lifting model performance, and that Clozer recognizes the gold answers independently of any heuristics.
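To make the extraction step concrete, the sketch below shows how a sequence-tagging model could mark candidate gold-answer spans in a passage, which are then masked to synthesize cloze-style TAPT examples. This is a hedged illustration, not the authors' released implementation: the BIO label scheme, the bert-base-uncased backbone, and the extract_cloze_answers helper are all assumptions for exposition, and the classification head would need fine-tuning on span annotations before its outputs are meaningful.

# Minimal sketch of sequence-tagging-based cloze answer extraction.
# Assumptions (not from the paper's released code): BIO tagging with
# labels {0: O, 1: B-ANS, 2: I-ANS} and a bert-base-uncased backbone.
from transformers import AutoTokenizer, AutoModelForTokenClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# The token-classification head here is randomly initialized; in practice
# it would be fine-tuned to tag answer spans before use.
model = AutoModelForTokenClassification.from_pretrained(
    "bert-base-uncased", num_labels=3
)

def extract_cloze_answers(passage: str) -> list[str]:
    """Tag each token and collect contiguous B-ANS/I-ANS runs as candidates."""
    enc = tokenizer(passage, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**enc).logits          # shape: (1, seq_len, 3)
    labels = logits.argmax(-1)[0].tolist()
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0])
    spans, current = [], []
    for tok, lab in zip(tokens, labels):
        if lab == 1:                          # B-ANS opens a new span
            if current:
                spans.append(current)
            current = [tok]
        elif lab == 2 and current:            # I-ANS extends the open span
            current.append(tok)
        else:                                 # O (or stray I-ANS) closes it
            if current:
                spans.append(current)
            current = []
    if current:
        spans.append(current)
    return [tokenizer.convert_tokens_to_string(s) for s in spans]

# Each extracted span can then be replaced in the passage with a
# placeholder token to produce a cloze-style example for TAPT.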
Anthology ID:
2022.repl4nlp-1.7
Volume:
Proceedings of the 7th Workshop on Representation Learning for NLP
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Spandana Gella, He He, Bodhisattwa Prasad Majumder, Burcu Can, Eleonora Giunchiglia, Samuel Cahyawijaya, Sewon Min, Maximilian Mozes, Xiang Lorraine Li, Isabelle Augenstein, Anna Rogers, Kyunghyun Cho, Edward Grefenstette, Laura Rimell, Chris Dyer
Venue:
RepL4NLP
Publisher:
Association for Computational Linguistics
Pages:
60–66
URL:
https://aclanthology.org/2022.repl4nlp-1.7
DOI:
10.18653/v1/2022.repl4nlp-1.7
Bibkey:
Cite (ACL):
Holy Lovenia, Bryan Wilie, Willy Chung, Zeng Min, Samuel Cahyawijaya, Dan Su, and Pascale Fung. 2022. Clozer: Adaptable Data Augmentation for Cloze-style Reading Comprehension. In Proceedings of the 7th Workshop on Representation Learning for NLP, pages 60–66, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Clozer: Adaptable Data Augmentation for Cloze-style Reading Comprehension (Lovenia et al., RepL4NLP 2022)
PDF:
https://aclanthology.org/2022.repl4nlp-1.7.pdf
Video:
https://aclanthology.org/2022.repl4nlp-1.7.mp4
Data
ReCAM