
Character-based Decoding in Tree-to-Sequence Attention-based Neural Machine Translation

Akiko Eriguchi, Kazuma Hashimoto, Yoshimasa Tsuruoka


Abstract
This paper reports our systems (UT-AKY) submitted to the 3rd Workshop on Asian Translation 2016 (WAT’16) and their results on the English-to-Japanese translation task. Our model is based on the tree-to-sequence Attention-based NMT (ANMT) model proposed by Eriguchi et al. (2016). We submitted two ANMT systems: one with a word-based decoder and the other with a character-based decoder. In experiments on the English-to-Japanese translation task, we confirmed that the character-based decoder can cover almost the full vocabulary of the target language and generate translations much faster than the word-based model.
Anthology ID:
W16-4617
Volume:
Proceedings of the 3rd Workshop on Asian Translation (WAT2016)
Month:
December
Year:
2016
Address:
Osaka, Japan
Editors:
Toshiaki Nakazawa, Hideya Mino, Chenchen Ding, Isao Goto, Graham Neubig, Sadao Kurohashi, Ir. Hammam Riza, Pushpak Bhattacharyya
Venue:
WAT
Publisher:
The COLING 2016 Organizing Committee
Pages:
175–183
URL:
https://aclanthology.org/W16-4617
Cite (ACL):
Akiko Eriguchi, Kazuma Hashimoto, and Yoshimasa Tsuruoka. 2016. Character-based Decoding in Tree-to-Sequence Attention-based Neural Machine Translation. In Proceedings of the 3rd Workshop on Asian Translation (WAT2016), pages 175–183, Osaka, Japan. The COLING 2016 Organizing Committee.
Cite (Informal):
Character-based Decoding in Tree-to-Sequence Attention-based Neural Machine Translation (Eriguchi et al., WAT 2016)
PDF:
https://aclanthology.org/W16-4617.pdf
Data
ASPEC