
Debugging Sequence-to-Sequence Models with Seq2Seq-Vis

Hendrik Strobelt, Sebastian Gehrmann, Michael Behrisch, Adam Perer, Hanspeter Pfister, Alexander Rush


Abstract
Neural attention-based sequence-to-sequence models (seq2seq) (Sutskever et al., 2014; Bahdanau et al., 2014) have proven to be accurate and robust for many sequence prediction tasks. They have become the standard approach for automatic translation of text, at the cost of increased model complexity and uncertainty. End-to-end trained neural models act as a black box, which makes it difficult to examine model decisions and attribute errors to a specific part of the model. The highly connected and high-dimensional internal representations pose a challenge for analysis and visualization tools. The development of methods to understand seq2seq predictions is crucial for systems in production settings, as mistakes involving language are often very apparent to human readers. For instance, a widely publicized incident resulted from a translation system mistakenly translating “good morning” into “attack them”, leading to a wrongful arrest (Hern, 2017).
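To make concrete what kind of internal state such a debugging tool inspects, the sketch below computes a dot-product attention matrix aligning each decoder step to the encoder states and prints it as a source-target alignment table. This is a minimal illustration, not code from Seq2Seq-Vis: the tokens, hidden size, and random vectors are hypothetical stand-ins for a trained model's hidden states.

```python
import numpy as np

rng = np.random.default_rng(0)

src_tokens = ["good", "morning", "</s>"]   # hypothetical source sentence
tgt_tokens = ["guten", "morgen", "</s>"]   # hypothetical model output
d = 8                                      # toy hidden size

# Random stand-ins for encoder/decoder hidden states (one vector per token).
encoder_states = rng.normal(size=(len(src_tokens), d))
decoder_states = rng.normal(size=(len(tgt_tokens), d))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Attention matrix: rows = decoder steps, columns = source positions.
scores = decoder_states @ encoder_states.T / np.sqrt(d)
attention = softmax(scores, axis=-1)

# Print the alignment a visualization tool would render as a heatmap.
print("          " + "  ".join(f"{t:>8}" for t in src_tokens))
for tgt, row in zip(tgt_tokens, attention):
    print(f"{tgt:>8}  " + "  ".join(f"{w:8.2f}" for w in row))
```

Each row of the printed table sums to 1; inspecting which source positions receive high weight at each output step is one way to attribute a translation error to a specific part of the model, as the abstract describes.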
Anthology ID:
W18-5451
Volume:
Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
Month:
November
Year:
2018
Address:
Brussels, Belgium
Editors:
Tal Linzen, Grzegorz Chrupała, Afra Alishahi
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
368–370
URL:
https://aclanthology.org/W18-5451
DOI:
10.18653/v1/W18-5451
Cite (ACL):
Hendrik Strobelt, Sebastian Gehrmann, Michael Behrisch, Adam Perer, Hanspeter Pfister, and Alexander Rush. 2018. Debugging Sequence-to-Sequence Models with Seq2Seq-Vis. In Proceedings of the 2018 EMNLP Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, pages 368–370, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Debugging Sequence-to-Sequence Models with Seq2Seq-Vis (Strobelt et al., EMNLP 2018)
PDF:
https://aclanthology.org/W18-5451.pdf