
ENGINE: Energy-Based Inference Networks for Non-Autoregressive Machine Translation

Lifu Tu, Richard Yuanzhe Pang, Sam Wiseman, Kevin Gimpel


Abstract
We propose to train a non-autoregressive machine translation model to minimize the energy defined by a pretrained autoregressive model. In particular, we view our non-autoregressive translation system as an inference network (Tu and Gimpel, 2018) trained to minimize the autoregressive teacher energy. This contrasts with the popular approach of training a non-autoregressive model on a distilled corpus consisting of the beam-searched outputs of such a teacher model. Our approach, which we call ENGINE (ENerGy-based Inference NEtworks), achieves state-of-the-art non-autoregressive results on the IWSLT 2014 DE-EN and WMT 2016 RO-EN datasets, approaching the performance of autoregressive models.
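The core idea in the abstract is to score a non-autoregressive model's relaxed output by the energy a pretrained autoregressive teacher assigns to it, and to train the output to lower that energy. The sketch below is a hypothetical toy illustration, not the paper's implementation: the "teacher" is a hand-built bigram table rather than a real seq2seq model, and the relaxed output is a matrix of per-position token probabilities.

```python
import numpy as np

# Toy "pretrained autoregressive teacher": a fixed bigram table over a
# 4-token vocabulary. p_teacher[i] is the next-token distribution after
# token i. (Hypothetical stand-in for a trained translation model.)
p_teacher = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.7, 0.1, 0.1],
    [0.1, 0.1, 0.7, 0.1],
    [0.1, 0.1, 0.1, 0.7],
])

def energy(q, start=0):
    """ENGINE-style energy of a relaxed output q (T x V rows of token
    probabilities): expected negative log-likelihood under the teacher,
    feeding each position's argmax token back as the next prefix token."""
    e, prev = 0.0, start
    for q_t in q:
        e -= q_t @ np.log(p_teacher[prev])  # expectation over tokens
        prev = int(q_t.argmax())
    return e

T, V = 3, 4
# A relaxed output committed to the teacher's greedy path...
greedy = np.zeros((T, V))
prev = 0
for t in range(T):
    prev = int(p_teacher[prev].argmax())
    greedy[t, prev] = 1.0
# ...versus an uncommitted, uniform output.
uniform = np.full((T, V), 1.0 / V)

print(energy(greedy), energy(uniform))
```

An inference network trained as in the paper would adjust its parameters (by gradient descent through a relaxation like this `q`) so that its outputs move toward the low-energy configuration; here the committed greedy output scores a lower energy than the uniform one, which is the signal that training exploits.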
Anthology ID:
2020.acl-main.251
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2819–2826
URL:
https://aclanthology.org/2020.acl-main.251
DOI:
10.18653/v1/2020.acl-main.251
Cite (ACL):
Lifu Tu, Richard Yuanzhe Pang, Sam Wiseman, and Kevin Gimpel. 2020. ENGINE: Energy-Based Inference Networks for Non-Autoregressive Machine Translation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 2819–2826, Online. Association for Computational Linguistics.
Cite (Informal):
ENGINE: Energy-Based Inference Networks for Non-Autoregressive Machine Translation (Tu et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.251.pdf
Video:
http://slideslive.com/38929336
Code:
lifu-tu/ENGINE