
Context-Aware Transformer Pre-Training for Answer Sentence Selection

Luca Di Liello, Siddhant Garg, Alessandro Moschitti


Abstract
Answer Sentence Selection (AS2) is a core component for building an accurate Question Answering pipeline. AS2 models rank a set of candidate sentences based on how likely they answer a given question. The state of the art in AS2 exploits pre-trained transformers by transferring them to large annotated datasets, while using local contextual information around the candidate sentence. In this paper, we propose three pre-training objectives designed to mimic the downstream fine-tuning task of contextual AS2. This allows for specializing LMs when fine-tuning for contextual AS2. Our experiments on three public and two large-scale industrial datasets show that our pre-training approaches (applied to RoBERTa and ELECTRA) can improve baseline contextual AS2 accuracy by up to 8% on some datasets.
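For readers unfamiliar with the task setup, the following is a minimal, hypothetical sketch of AS2 inference with a HuggingFace-style transformer cross-encoder: each (question, candidate) pair is encoded jointly, scored, and candidates are ranked by score. The checkpoint name, example data, and label convention are illustrative assumptions, not the authors' released models or code.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder checkpoint; in practice this would be a RoBERTa/ELECTRA model
# fine-tuned on an AS2 dataset (the paper additionally specializes pre-training).
model_name = "roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)
model.eval()

question = "Who wrote The Old Man and the Sea?"
candidates = [
    "Ernest Hemingway published The Old Man and the Sea in 1952.",
    "The novella is set in a fishing village near Havana.",
    "Hemingway won the Pulitzer Prize for Fiction in 1953.",
]

# Cross-encode each question-candidate pair in one batch.
inputs = tokenizer([question] * len(candidates), candidates,
                   padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assume label 1 means "this sentence answers the question"; rank by that probability.
scores = torch.softmax(logits, dim=-1)[:, 1]
for score, sentence in sorted(zip(scores.tolist(), candidates), reverse=True):
    print(f"{score:.3f}  {sentence}")

Contextual AS2, as studied in the paper, would additionally feed local context around each candidate into the encoder; the sketch above shows only the basic pairwise ranking setup.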
Anthology ID:
2023.acl-short.40
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
458–468
URL:
https://aclanthology.org/2023.acl-short.40
DOI:
10.18653/v1/2023.acl-short.40
Cite (ACL):
Luca Di Liello, Siddhant Garg, and Alessandro Moschitti. 2023. Context-Aware Transformer Pre-Training for Answer Sentence Selection. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 458–468, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Context-Aware Transformer Pre-Training for Answer Sentence Selection (Di Liello et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.40.pdf
Video:
https://aclanthology.org/2023.acl-short.40.mp4