
InDeep × NMT: Empowering Human Translators via Interpretable Neural Machine Translation

Gabriele Sarti, Arianna Bisazza


Abstract
Neural machine translation (NMT) systems are nowadays essential components of professional translation workflows. Consequently, human translators are increasingly working as post-editors for machine-translated content. The NWO-funded InDeep project aims to empower users of Deep Learning models of text, speech, and music by improving their ability to interact with such models and interpret their behaviors. In the specific context of translation, we aim to develop new tools and methodologies to improve prediction attribution, error analysis, and controllable generation for NMT systems. These advances will be evaluated through field studies involving professional translators to assess gains in efficiency and overall enjoyability of the post-editing process.
Anthology ID:
2022.eamt-1.46
Volume:
Proceedings of the 23rd Annual Conference of the European Association for Machine Translation
Month:
June
Year:
2022
Address:
Ghent, Belgium
Editors:
Helena Moniz, Lieve Macken, Andrew Rufener, Loïc Barrault, Marta R. Costa-jussà, Christophe Declercq, Maarit Koponen, Ellie Kemp, Spyridon Pilos, Mikel L. Forcada, Carolina Scarton, Joachim Van den Bogaert, Joke Daems, Arda Tezcan, Bram Vanroy, Margot Fonteyne
Venue:
EAMT
Publisher:
European Association for Machine Translation
Pages:
313–314
URL:
https://aclanthology.org/2022.eamt-1.46
Cite (ACL):
Gabriele Sarti and Arianna Bisazza. 2022. InDeep × NMT: Empowering Human Translators via Interpretable Neural Machine Translation. In Proceedings of the 23rd Annual Conference of the European Association for Machine Translation, pages 313–314, Ghent, Belgium. European Association for Machine Translation.
Cite (Informal):
InDeep × NMT: Empowering Human Translators via Interpretable Neural Machine Translation (Sarti & Bisazza, EAMT 2022)
PDF:
https://aclanthology.org/2022.eamt-1.46.pdf