
Findings of the Word-Level AutoCompletion Shared Task in WMT 2023

Lemao Liu, Francisco Casacuberta, George Foster, Guoping Huang, Philipp Koehn, Geza Kovacs, Shuming Shi, Taro Watanabe, Chengqing Zong


Abstract
This paper presents an overview of the second Word-Level AutoCompletion (WLAC) shared task for computer-aided translation, which aims to automatically complete a target word given a translation context that includes a character sequence typed by a human. We largely adhere to the settings of the previous round of the shared task, with two main differences: 1) when preparing some of the test examples, the typed character sequence is obtained from the actual typing process of human translators, so that system performance is demonstrated under real-world conditions; 2) we conduct a thorough analysis of the submitted systems' results from three perspectives. From the experimental results, we observe that translation tasks are helpful for improving the performance of WLAC models. Our further analysis shows that semantic errors account for a significant portion of all errors, so it would be promising to take this type of error into account in future work.
Anthology ID:
2023.wmt-1.53
Volume:
Proceedings of the Eighth Conference on Machine Translation
Month:
December
Year:
2023
Address:
Singapore
Editors:
Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
654–662
URL:
https://aclanthology.org/2023.wmt-1.53
DOI:
10.18653/v1/2023.wmt-1.53
Cite (ACL):
Lemao Liu, Francisco Casacuberta, George Foster, Guoping Huang, Philipp Koehn, Geza Kovacs, Shuming Shi, Taro Watanabe, and Chengqing Zong. 2023. Findings of the Word-Level AutoCompletion Shared Task in WMT 2023. In Proceedings of the Eighth Conference on Machine Translation, pages 654–662, Singapore. Association for Computational Linguistics.
Cite (Informal):
Findings of the Word-Level AutoCompletion Shared Task in WMT 2023 (Liu et al., WMT 2023)
PDF:
https://aclanthology.org/2023.wmt-1.53.pdf