
Zero-shot Faithful Factual Error Correction

Kung-Hsiang Huang, Hou Pong Chan, Heng Ji


Abstract
Faithfully correcting factual errors is critical for maintaining the integrity of textual knowledge bases and for preventing hallucinations in sequence-to-sequence models. Drawing on humans’ ability to identify and correct factual errors, we present a zero-shot framework that formulates questions about input claims, looks for correct answers in the given evidence, and assesses the faithfulness of each correction based on its consistency with the evidence. Our zero-shot framework outperforms fully-supervised approaches, as demonstrated by experiments on the FEVER and SciFact datasets, where our outputs are shown to be more faithful. More importantly, the decomposable nature of our framework inherently provides interpretability. Additionally, to reveal the most suitable metrics for evaluating factual error corrections, we analyze the correlation of commonly used metrics with human judgments along three dimensions concerning intelligibility and faithfulness.
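
The abstract outlines a three-step pipeline: generate questions about spans of a claim, answer them against the given evidence, and keep a correction only if it is consistent with that evidence. The sketch below is an illustrative reconstruction of such a pipeline, not the authors' released code; the placeholder model names, the entity_span argument, and the entailment-based check are assumptions built from generic off-the-shelf components.

    from transformers import pipeline

    # Placeholder model names: substitute any question-generation,
    # extractive QA, and NLI checkpoints of your choice.
    question_generator = pipeline("text2text-generation", model="YOUR_QG_MODEL")
    answer_extractor = pipeline("question-answering", model="YOUR_QA_MODEL")
    nli_checker = pipeline("text-classification", model="YOUR_NLI_MODEL")

    def correct_claim(claim: str, evidence: str, entity_span: str) -> str:
        """Correct a possibly erroneous span in `claim` using `evidence`,
        keeping the edit only if the evidence supports the corrected claim."""
        # 1. Formulate a question about the targeted span of the claim.
        prompt = "generate question: " + claim.replace(entity_span, "<mask>")
        question = question_generator(prompt, max_new_tokens=32)[0]["generated_text"]

        # 2. Look for the correct answer in the given evidence.
        answer = answer_extractor(question=question, context=evidence)["answer"]

        # 3. Apply the candidate correction.
        corrected = claim.replace(entity_span, answer)

        # 4. Faithfulness check: keep the correction only if the evidence
        #    entails the corrected claim; otherwise return the original.
        verdict = nli_checker(evidence + " </s> " + corrected)[0]
        return corrected if verdict["label"].lower() == "entailment" else claim

Because each step (question generation, answering, verification) is run separately, intermediate outputs can be inspected directly, which is the source of the interpretability noted in the abstract.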
Anthology ID:
2023.acl-long.311
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
5660–5676
Language:
URL:
https://aclanthology.org/2023.acl-long.311
DOI:
10.18653/v1/2023.acl-long.311
Bibkey:
Cite (ACL):
Kung-Hsiang Huang, Hou Pong Chan, and Heng Ji. 2023. Zero-shot Faithful Factual Error Correction. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 5660–5676, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Zero-shot Faithful Factual Error Correction (Huang et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.311.pdf
Video:
https://aclanthology.org/2023.acl-long.311.mp4