
Memory Consolidation for Contextual Spoken Language Understanding with Dialogue Logistic Inference

He Bai, Yu Zhou, Jiajun Zhang, Chengqing Zong


Abstract
Dialogue contexts have proven helpful to spoken language understanding (SLU) systems, and they are typically encoded with explicit memory representations. However, most previous models learn the context memory with a single objective of maximizing SLU performance, leaving the context memory under-exploited. In this paper, we propose a new dialogue logistic inference (DLI) task to consolidate the context memory jointly with SLU in a multi-task framework. DLI is defined as sorting a shuffled dialogue session into its original logical order, and it shares the same memory encoder and retrieval mechanism as the SLU model. Our experimental results show that various popular contextual SLU models benefit from our approach, and the improvements are quite impressive, especially in slot filling.
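Since this page carries only the abstract, the sketch below illustrates the joint-training idea it describes rather than the authors' actual implementation: dialogue turns are encoded into a shared context memory, the SLU model reads that memory, and an auxiliary DLI head is trained to recover the original turn order of a shuffled session. The pairwise-ordering formulation, all module names, dimensions, and the loss weight `lam` are illustrative assumptions, not details from the paper.

```python
import random
import torch
import torch.nn as nn

class SharedMemoryEncoder(nn.Module):
    """Encodes each dialogue turn into one memory vector (shared by SLU and DLI)."""
    def __init__(self, vocab_size, emb_dim=100, hid_dim=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, turns):                 # turns: (num_turns, seq_len) token ids
        _, h = self.gru(self.emb(turns))      # h: (1, num_turns, hid_dim)
        return h.squeeze(0)                   # memory: (num_turns, hid_dim)

class PairwiseOrderHead(nn.Module):
    """Hypothetical DLI head: scores whether turn i logically precedes turn j."""
    def __init__(self, hid_dim=128):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * hid_dim, hid_dim), nn.Tanh(),
                                 nn.Linear(hid_dim, 1))

    def forward(self, mem_i, mem_j):          # each: (hid_dim,)
        return self.mlp(torch.cat([mem_i, mem_j], dim=-1))  # logit: (1,)

def joint_loss(encoder, dli_head, turns, slu_loss, lam=0.5):
    """Multi-task objective: SLU loss plus a weighted DLI ordering loss.

    `slu_loss` is assumed to be computed by an SLU model reading the same
    shared memory; `lam` is an assumed interpolation weight.
    """
    memory = encoder(turns)                   # shared context memory
    i, j = random.sample(range(memory.size(0)), 2)  # two turns of a shuffled session
    label = torch.tensor([[1.0 if i < j else 0.0]])
    logit = dli_head(memory[i], memory[j]).unsqueeze(0)
    dli = nn.functional.binary_cross_entropy_with_logits(logit, label)
    return slu_loss + lam * dli
```

In a training step, `slu_loss` would come from the contextual SLU model over the same memory; backpropagating the combined loss through the shared encoder is what "consolidates" the context memory in the abstract's sense.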
Anthology ID:
P19-1541
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Note:
Pages:
5448–5453
URL:
https://aclanthology.org/P19-1541
DOI:
10.18653/v1/P19-1541
Cite (ACL):
He Bai, Yu Zhou, Jiajun Zhang, and Chengqing Zong. 2019. Memory Consolidation for Contextual Spoken Language Understanding with Dialogue Logistic Inference. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 5448–5453, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Memory Consolidation for Contextual Spoken Language Understanding with Dialogue Logistic Inference (Bai et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1541.pdf