
How Helpful is Inverse Reinforcement Learning for Table-to-Text Generation?

Sayan Ghosh, Zheng Qi, Snigdha Chaturvedi, Shashank Srivastava


Abstract
Existing approaches to the Table-to-Text task suffer from issues such as missing information, hallucination, and repetition. Many approaches to this problem use Reinforcement Learning (RL), which maximizes a single manually defined reward, such as BLEU. In this work, we instead pose the Table-to-Text task as an Inverse Reinforcement Learning (IRL) problem. We explore using multiple interpretable unsupervised reward components that are combined linearly to form a composite reward function. The composite reward function and the description generator are learned jointly. We find that IRL marginally outperforms strong RL baselines. We further study the generalization of learned IRL rewards in scenarios involving domain adaptation. Our experiments reveal significant challenges in using IRL for this task.
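The composite reward described in the abstract can be sketched as a simple linear combination of component scores. This is a hypothetical illustration, not the authors' implementation: the component names (coverage, fluency, non-repetition) and the weights are assumptions for the example.

```python
import numpy as np

def composite_reward(components: np.ndarray, weights: np.ndarray) -> float:
    """Linearly combine interpretable per-component reward scores.

    components: scores from each unsupervised reward component for one sample.
    weights: linear combination weights (learned jointly with the generator in IRL).
    """
    return float(np.dot(weights, components))

# Hypothetical example with three assumed components:
# coverage, fluency, and non-repetition.
weights = np.array([0.5, 0.3, 0.2])
components = np.array([0.8, 0.9, 1.0])
reward = composite_reward(components, weights)  # 0.5*0.8 + 0.3*0.9 + 0.2*1.0 = 0.87
```

In an IRL setup, the weights themselves are updated so that reference descriptions score higher than generated ones, rather than being fixed by hand as in standard RL.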
Anthology ID:
2021.acl-short.11
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
71–79
URL:
https://aclanthology.org/2021.acl-short.11
DOI:
10.18653/v1/2021.acl-short.11
Cite (ACL):
Sayan Ghosh, Zheng Qi, Snigdha Chaturvedi, and Shashank Srivastava. 2021. How Helpful is Inverse Reinforcement Learning for Table-to-Text Generation?. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 71–79, Online. Association for Computational Linguistics.
Cite (Informal):
How Helpful is Inverse Reinforcement Learning for Table-to-Text Generation? (Ghosh et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-short.11.pdf
Optional supplementary material:
 2021.acl-short.11.OptionalSupplementaryMaterial.zip
Video:
 https://aclanthology.org/2021.acl-short.11.mp4
Code
 issacqzh/irl_table2text