
Attend, Memorize and Generate: Towards Faithful Table-to-Text Generation in Few Shots

Wenting Zhao, Ye Liu, Yao Wan, Philip Yu


Abstract
Few-shot table-to-text generation is the task of composing fluent and faithful sentences that convey table content using limited training data. Although many efforts have been made to generate impressively fluent sentences by fine-tuning powerful pre-trained language models, the faithfulness of the generated content still needs improvement. To this end, this paper proposes a novel approach, Attend, Memorize and Generate (AMG), inspired by the text generation process of humans. In particular, AMG (1) attends over the multi-granularity of context using a novel strategy that combines table slot level and traditional token-by-token level attention, exploiting both the table structure and natural linguistic information; (2) dynamically memorizes the table slot allocation states; and (3) generates faithful sentences according to both the context and the memory allocation states. Comprehensive experiments with human evaluation on three domains (i.e., humans, songs, and books) of the Wiki dataset show that our model generates higher-quality text than several state-of-the-art baselines, in terms of both fluency and faithfulness.
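The attend-and-memorize idea from the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the gating function, the coverage-style memory update, and all variable names (`token_keys`, `slot_ids`, `memory`) are assumptions made for clarity. It shows how token-level attention can be aggregated into slot-level attention, and how a memory of slot allocation states can down-weight slots that have already been covered.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attend_memorize(query, token_keys, slot_ids, memory):
    """Combine token-level and slot-level attention with a slot memory.

    query      : (d,) decoder query vector
    token_keys : (n_tokens, d) key vectors for the linearized table tokens
    slot_ids   : length-n_tokens list mapping each token to its table slot
    memory     : (n_slots,) coverage in [0, 1] of how much each slot
                 has already been allocated to the output

    Returns token-level attention, memory-gated slot-level attention,
    and the updated memory. (Hypothetical formulation for illustration.)
    """
    # Token-by-token attention over the linearized table.
    token_attn = softmax(token_keys @ query)

    # Aggregate token attention into slot-level attention, so the model
    # can reason over table structure rather than individual tokens.
    n_slots = memory.shape[0]
    slot_attn = np.zeros(n_slots)
    for i, s in enumerate(slot_ids):
        slot_attn[s] += token_attn[i]

    # Gate slot attention by the memory: slots that are already fully
    # covered contribute less, discouraging repetition and hallucination.
    gated = slot_attn * (1.0 - memory)

    # Update the memory with the newly allocated attention mass.
    new_memory = np.clip(memory + gated, 0.0, 1.0)
    return token_attn, gated, new_memory
```

In this sketch, repeatedly attending to the same slot drives its memory entry toward 1, so its gated attention shrinks over decoding steps; the actual AMG model realizes this idea with learned components.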
Anthology ID:
2021.findings-emnlp.347
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4106–4117
URL:
https://aclanthology.org/2021.findings-emnlp.347
DOI:
10.18653/v1/2021.findings-emnlp.347
Cite (ACL):
Wenting Zhao, Ye Liu, Yao Wan, and Philip Yu. 2021. Attend, Memorize and Generate: Towards Faithful Table-to-Text Generation in Few Shots. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 4106–4117, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Attend, Memorize and Generate: Towards Faithful Table-to-Text Generation in Few Shots (Zhao et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.347.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.347.mp4
Code:
wentinghome/amg
Data:
WikiBio