
GraphPlan: Story Generation by Planning with Event Graph

Hong Chen, Raphael Shu, Hiroya Takamura, Hideki Nakayama


Abstract
Story generation is a task that aims to automatically generate a meaningful story. This task is challenging because it requires a high-level understanding of the semantic meaning of sentences and the causality of story events. Naive sequence-to-sequence models generally fail to acquire such knowledge, as it is difficult to guarantee logical correctness in a text generation model without strategic planning. In this study, we focus on planning a sequence of events assisted by event graphs and use the events to guide the generator. Rather than using a sequence-to-sequence model to output a sequence, as in some existing works, we propose to generate an event sequence by walking on an event graph. The event graphs are built automatically from the corpus. To evaluate the proposed approach, we incorporate human participation, both in event planning and story generation. Based on large-scale human annotation results, our proposed approach provides more logically correct event sequences and stories than previous approaches.
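To make the planning idea in the abstract concrete, the sketch below shows one simple way an event graph could be built from corpus event sequences and then walked to plan a new sequence. This is a minimal illustration under assumed simplifications (bigram co-occurrence counts as edges, frequency-weighted sampling as the walk policy); the function names and scoring are hypothetical and not the paper's actual method.

```python
import random
from collections import defaultdict

def build_event_graph(event_sequences):
    """Count how often one event is followed by another across the corpus
    (an assumed, simplified stand-in for the paper's event graph)."""
    graph = defaultdict(lambda: defaultdict(int))
    for seq in event_sequences:
        for prev, nxt in zip(seq, seq[1:]):
            graph[prev][nxt] += 1
    return graph

def plan_events(graph, start_event, length=5, seed=0):
    """Plan an event sequence by walking on the graph, sampling each next
    event in proportion to how often it followed the current event."""
    rng = random.Random(seed)
    plan = [start_event]
    current = start_event
    for _ in range(length - 1):
        successors = graph.get(current)
        if not successors:
            break  # dead end: no observed continuation for this event
        events, counts = zip(*successors.items())
        current = rng.choices(events, weights=counts, k=1)[0]
        plan.append(current)
    return plan

if __name__ == "__main__":
    # Toy corpus of event sequences (e.g., extracted predicate phrases).
    corpus = [
        ["meet_friend", "go_cafe", "order_coffee", "chat", "say_goodbye"],
        ["meet_friend", "go_park", "chat", "say_goodbye"],
        ["go_cafe", "order_coffee", "read_book"],
    ]
    graph = build_event_graph(corpus)
    print(plan_events(graph, "meet_friend"))
```

In the paper's setting, the planned event sequence would then condition a text generator that surfaces each event as a story sentence; the sketch only covers the graph construction and walk.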
Anthology ID:
2021.inlg-1.42
Volume:
Proceedings of the 14th International Conference on Natural Language Generation
Month:
August
Year:
2021
Address:
Aberdeen, Scotland, UK
Editors:
Anya Belz, Angela Fan, Ehud Reiter, Yaji Sripada
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
377–386
URL:
https://aclanthology.org/2021.inlg-1.42
DOI:
10.18653/v1/2021.inlg-1.42
Cite (ACL):
Hong Chen, Raphael Shu, Hiroya Takamura, and Hideki Nakayama. 2021. GraphPlan: Story Generation by Planning with Event Graph. In Proceedings of the 14th International Conference on Natural Language Generation, pages 377–386, Aberdeen, Scotland, UK. Association for Computational Linguistics.
Cite (Informal):
GraphPlan: Story Generation by Planning with Event Graph (Chen et al., INLG 2021)
PDF:
https://aclanthology.org/2021.inlg-1.42.pdf