
Models of reference production: How do they withstand the test of time?

Fahime Same, Guanyi Chen, Kees van Deemter


Abstract
In recent years, many NLP studies have focused solely on performance improvement. In this work, we focus instead on the linguistic and scientific aspects of NLP. We use the task of generating referring expressions in context (REG-in-context) as a case study, and we start our analysis from GREC, a comprehensive set of shared tasks in English that addressed this topic over a decade ago. We ask what the performance of models would be if we assessed them (1) on more realistic datasets and (2) using more advanced methods. We test the models using different evaluation metrics and feature selection experiments. We conclude that GREC can no longer be regarded as offering a reliable assessment of models’ ability to mimic human reference production, because the results are highly impacted by the choice of corpus and evaluation metrics. Our results also suggest that pre-trained language models are less dependent on the choice of corpus than classic Machine Learning models, and therefore make more robust class predictions.
Anthology ID:
2023.inlg-main.7
Volume:
Proceedings of the 16th International Natural Language Generation Conference
Month:
September
Year:
2023
Address:
Prague, Czechia
Editors:
C. Maria Keet, Hung-Yi Lee, Sina Zarrieß
Venues:
INLG | SIGDIAL
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
93–105
URL:
https://aclanthology.org/2023.inlg-main.7
DOI:
10.18653/v1/2023.inlg-main.7
Cite (ACL):
Fahime Same, Guanyi Chen, and Kees van Deemter. 2023. Models of reference production: How do they withstand the test of time?. In Proceedings of the 16th International Natural Language Generation Conference, pages 93–105, Prague, Czechia. Association for Computational Linguistics.
Cite (Informal):
Models of reference production: How do they withstand the test of time? (Same et al., INLG-SIGDIAL 2023)
PDF:
https://aclanthology.org/2023.inlg-main.7.pdf