
QE BERT: Bilingual BERT Using Multi-task Learning for Neural Quality Estimation

Hyun Kim, Joon-Ho Lim, Hyun-Ki Kim, Seung-Hoon Na


Abstract
For translation quality estimation at the word and sentence levels, this paper presents a novel approach based on BERT, which has recently achieved impressive results on various natural language processing tasks. Our proposed model re-purposes BERT for translation quality estimation and uses multi-task learning for the sentence-level task and the word-level subtasks (i.e., source word, target word, and target gap). Experimental results on the WMT19 Quality Estimation shared task show that our systems are competitive and provide significant improvements over the baseline.
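To illustrate the architecture the abstract describes, below is a minimal sketch (not the authors' code) of a multi-task QE model: a shared BERT encoder with one sentence-level regression head and three token-level tagging heads for the source-word, target-word, and target-gap subtasks. The head names, dimensions, choice of multilingual BERT, and sigmoid output for the sentence score are all assumptions for illustration.

```python
# Hypothetical sketch of a multi-task QE model in PyTorch, assuming a
# shared BERT encoder with a sentence-level head and three word-level
# OK/BAD tagging heads; not the implementation from the paper.
import torch
import torch.nn as nn
from transformers import BertModel

class QEBertMultiTask(nn.Module):
    def __init__(self, model_name="bert-base-multilingual-cased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Sentence-level head: predicts an HTER-style quality score in [0, 1].
        self.sent_head = nn.Linear(hidden, 1)
        # Word-level heads: binary OK/BAD tag per token position.
        self.src_word_head = nn.Linear(hidden, 2)
        self.tgt_word_head = nn.Linear(hidden, 2)
        self.tgt_gap_head = nn.Linear(hidden, 2)

    def forward(self, input_ids, attention_mask, token_type_ids):
        out = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask,
                        token_type_ids=token_type_ids)
        tokens = out.last_hidden_state   # (batch, seq_len, hidden)
        pooled = out.pooler_output       # (batch, hidden)
        return {
            "sentence_score": torch.sigmoid(self.sent_head(pooled)).squeeze(-1),
            "src_word_tags": self.src_word_head(tokens),
            "tgt_word_tags": self.tgt_word_head(tokens),
            "tgt_gap_tags": self.tgt_gap_head(tokens),
        }
```

In a multi-task setup like this, training would typically minimize a weighted sum of a regression loss (e.g., MSE) on the sentence score and cross-entropy losses on the three tagging heads, so that all tasks share and jointly update the bilingual encoder.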
Anthology ID:
W19-5407
Volume:
Proceedings of the Fourth Conference on Machine Translation (Volume 3: Shared Task Papers, Day 2)
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Ondřej Bojar, Rajen Chatterjee, Christian Federmann, Mark Fishel, Yvette Graham, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Christof Monz, Matteo Negri, Aurélie Névéol, Mariana Neves, Matt Post, Marco Turchi, Karin Verspoor
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
85–89
URL:
https://aclanthology.org/W19-5407
DOI:
10.18653/v1/W19-5407
Cite (ACL):
Hyun Kim, Joon-Ho Lim, Hyun-Ki Kim, and Seung-Hoon Na. 2019. QE BERT: Bilingual BERT Using Multi-task Learning for Neural Quality Estimation. In Proceedings of the Fourth Conference on Machine Translation (Volume 3: Shared Task Papers, Day 2), pages 85–89, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
QE BERT: Bilingual BERT Using Multi-task Learning for Neural Quality Estimation (Kim et al., WMT 2019)
PDF:
https://aclanthology.org/W19-5407.pdf