
Chain-of-Rewrite: Aligning Question and Documents for Open-Domain Question Answering

Chunlei Xin, Yaojie Lu, Hongyu Lin, Shuheng Zhou, Huijia Zhu, Weiqiang Wang, Zhongyi Liu, Xianpei Han, Le Sun


Abstract
Despite the advancements made with the retrieve-then-read pipeline on the open-domain question answering task, current methods still face challenges stemming from term mismatch and limited interaction between information retrieval systems and large language models. To mitigate these issues, we propose the Chain-of-Rewrite method, which leverages the guidance and feedback gained from question analysis to provide faithful and consistent extensions for effective question answering. Through a two-step rewriting process comprising Semantic Analysis and Semantic Augmentation, the Chain-of-Rewrite method effectively bridges the gap between the user question and relevant documents. By incorporating feedback from the rewriting process, our method can self-correct the retrieval and reading process to further improve performance. Experiments on four open-domain question answering datasets demonstrate the effectiveness of our system under zero-shot settings.
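The two-step rewriting process described above can be sketched as a simple pipeline. This is a minimal illustrative sketch, not the paper's implementation: the `llm` stub, the function names, and the prompt wording are all hypothetical stand-ins for the actual prompts and model calls used in the paper.

```python
def llm(prompt: str) -> str:
    """Stand-in for a large language model call; replace with a real API."""
    return f"[model output for: {prompt[:40]}...]"

def semantic_analysis(question: str) -> str:
    # Step 1 (Semantic Analysis): surface the question's intent and key terms,
    # providing guidance for the subsequent rewrite.
    return llm(f"Analyze the intent and key terms of this question: {question}")

def semantic_augmentation(question: str, analysis: str) -> str:
    # Step 2 (Semantic Augmentation): extend the question using the analysis,
    # so its wording better matches the vocabulary of relevant documents
    # and mitigates term mismatch at retrieval time.
    return llm(f"Analysis: {analysis}\nRewrite and extend the question: {question}")

def chain_of_rewrite(question: str) -> str:
    # Chain the two steps: the analysis feeds the augmentation.
    analysis = semantic_analysis(question)
    return semantic_augmentation(question, analysis)

rewritten = chain_of_rewrite("Who painted the ceiling of the Sistine Chapel?")
print(rewritten)
```

In a full retrieve-then-read system, the rewritten question would be sent to the retriever in place of (or alongside) the original, and feedback from the reader could trigger a further round of rewriting, as the abstract's self-correction step suggests.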
Anthology ID:
2024.findings-emnlp.104
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1884–1896
URL:
https://aclanthology.org/2024.findings-emnlp.104
Cite (ACL):
Chunlei Xin, Yaojie Lu, Hongyu Lin, Shuheng Zhou, Huijia Zhu, Weiqiang Wang, Zhongyi Liu, Xianpei Han, and Le Sun. 2024. Chain-of-Rewrite: Aligning Question and Documents for Open-Domain Question Answering. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 1884–1896, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Chain-of-Rewrite: Aligning Question and Documents for Open-Domain Question Answering (Xin et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.104.pdf