Knowledge-aware Dialogue Generation with Hybrid Attention (Student Abstract)
DOI:
https://doi.org/10.1609/aaai.v35i18.17972
Keywords:
Dialogue Generation, Knowledge Graph, Co-attention
Abstract
Using commonsense knowledge to assist dialogue generation is a major step forward for the dialogue generation task. However, fully utilizing commonsense information remains a challenge, and the entities generated in the response often fail to match the information in the post. In this paper, we propose a dialogue generation model that uses hybrid attention to generate more rational entities. Given a user post, the model encodes relevant knowledge graphs retrieved from a knowledge base with a graph attention mechanism. It then encodes the user post and the graphs with a co-attention mechanism, which effectively models their complex interactions. Through these mechanisms, the model obtains a better mutual understanding of the post and the knowledge. Experimental results show that our model is more effective than the current state-of-the-art model (CCM).
Downloads
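The abstract does not spell out the exact co-attention formulation, but the general idea of co-attending a post and a set of knowledge-graph entity embeddings can be sketched as below. This is an illustrative simplification, not the paper's implementation: the affinity matrix, the random projection `W` (learned in a real model), and the function names are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(post, graph, seed=0):
    """Co-attend post tokens (n x d) and graph entities (m x d).

    Illustrative sketch: an affinity matrix couples the two
    sequences, and row/column softmaxes give attention weights
    in both directions (post -> knowledge and knowledge -> post).
    """
    n, d = post.shape
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((d, d)) / np.sqrt(d)   # learned parameter in practice
    affinity = post @ W @ graph.T                  # (n, m) pairwise scores
    post2graph = softmax(affinity, axis=1)         # each post token attends over entities
    graph2post = softmax(affinity, axis=0).T       # each entity attends over post tokens
    knowledge_aware_post = post2graph @ graph      # (n, d)
    post_aware_knowledge = graph2post @ post       # (m, d)
    return knowledge_aware_post, post_aware_knowledge
```

The two outputs give each side a representation conditioned on the other, which is the "mutual understanding of post and knowledge" the abstract refers to.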
Published
2021-05-18
How to Cite
Zhao, Y., Cheng, B., & Zhang, Y. (2021). Knowledge-aware Dialogue Generation with Hybrid Attention (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 35(18), 15951-15952. https://doi.org/10.1609/aaai.v35i18.17972
Issue
Section
AAAI Student Abstract and Poster Program