
Efficient Dialogue State Tracking by Selectively Overwriting Memory

Sungdong Kim, Sohee Yang, Gyuwan Kim, Sang-Woo Lee


Abstract
Recent works in dialogue state tracking (DST) focus on an open vocabulary-based setting to resolve the scalability and generalization issues of predefined ontology-based approaches. However, they are inefficient in that they predict the dialogue state from scratch at every turn. Here, we consider the dialogue state as an explicit fixed-sized memory and propose a selectively overwriting mechanism for more efficient DST. This mechanism consists of two steps: (1) predicting the state operation on each of the memory slots, and (2) overwriting the memory with new values, only a few of which are generated according to the predicted state operations. Our method decomposes DST into these two sub-tasks and guides the decoder to focus on only one of them, reducing the burden on the decoder; this improves both the effectiveness of training and DST performance. Our SOM-DST (Selectively Overwriting Memory for Dialogue State Tracking) model achieves state-of-the-art joint goal accuracy of 51.72% on MultiWOZ 2.0 and 53.01% on MultiWOZ 2.1 in an open vocabulary-based DST setting. In addition, we analyze the accuracy gap between the current setting and the setting where ground truth is given, and suggest that improving state operation prediction is a promising direction for boosting DST performance.
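
As a rough illustration of the two-step mechanism described in the abstract, the Python sketch below predicts one operation per memory slot and then overwrites only the slots whose operation requires a new value. The four operation names follow the paper (CARRYOVER, DELETE, DONTCARE, UPDATE), but the helper functions `predict_operations` and `generate_value` are simplified keyword-based stand-ins for illustration only, not the released clovaai/som-dst model.

```python
from typing import Dict

# State operations used in SOM-DST.
CARRYOVER, DELETE, DONTCARE, UPDATE = "CARRYOVER", "DELETE", "DONTCARE", "UPDATE"


def predict_operations(state: Dict[str, str], utterance: str) -> Dict[str, str]:
    """Step 1 (stand-in): classify one operation per memory slot.

    In SOM-DST this is done by an encoder over the dialogue context and the
    previous state; here a trivial keyword heuristic stands in for it.
    """
    ops = {}
    for slot in state:
        slot_word = slot.split("-")[-1]
        ops[slot] = UPDATE if slot_word in utterance else CARRYOVER
    return ops


def generate_value(slot: str, utterance: str) -> str:
    """Stand-in for the value decoder; only called for slots marked UPDATE."""
    # A real system would run an autoregressive decoder here.
    return utterance.split()[-1]


def overwrite_memory(state: Dict[str, str], utterance: str) -> Dict[str, str]:
    """Step 2: overwrite only the slots whose predicted operation requires it."""
    ops = predict_operations(state, utterance)
    new_state = dict(state)
    for slot, op in ops.items():
        if op == CARRYOVER:
            continue  # keep the previous value as-is; no decoding needed
        elif op == DELETE:
            new_state[slot] = "none"
        elif op == DONTCARE:
            new_state[slot] = "dontcare"
        elif op == UPDATE:
            new_state[slot] = generate_value(slot, utterance)
    return new_state


if __name__ == "__main__":
    prev_state = {"hotel-area": "north", "hotel-parking": "none"}
    print(overwrite_memory(prev_state, "I need parking yes"))
    # -> {'hotel-area': 'north', 'hotel-parking': 'yes'}
```

Because most slots carry over between turns, the decoder is invoked only for the few UPDATE slots, which is the source of the efficiency gain the abstract describes.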
Anthology ID:
2020.acl-main.53
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
567–582
URL:
https://aclanthology.org/2020.acl-main.53
DOI:
10.18653/v1/2020.acl-main.53
Cite (ACL):
Sungdong Kim, Sohee Yang, Gyuwan Kim, and Sang-Woo Lee. 2020. Efficient Dialogue State Tracking by Selectively Overwriting Memory. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 567–582, Online. Association for Computational Linguistics.
Cite (Informal):
Efficient Dialogue State Tracking by Selectively Overwriting Memory (Kim et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.53.pdf
Video:
http://slideslive.com/38929359
Code:
clovaai/som-dst (+ additional community code)
Data:
MultiWOZ