
HW-TSC’s Submissions to the WMT23 Discourse-Level Literary Translation Shared Task

Yuhao Xie, Zongyao Li, Zhanglin Wu, Daimeng Wei, Xiaoyu Chen, Zhiqiang Rao, Shaojun Li, Hengchao Shang, Jiaxin Guo, Lizhi Lei, Hao Yang, Yanfei Jiang


Abstract
This paper introduces HW-TSC’s submission to the WMT23 Discourse-Level Literary Translation shared task. We use a standard sentence-level Transformer as a baseline and perform domain adaptation and discourse modeling to enhance discourse-level capabilities. For domain adaptation, we employ Back-Translation, Forward-Translation, and Data Diversification. For discourse modeling, we apply strategies such as Multi-resolutional Document-to-Document Translation and Training Data Augmentation.
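As a rough illustration of the multi-resolutional document-to-document idea named in the abstract, the Python sketch below shows one common way such training data can be built: each sentence-aligned document pair is sliced into chunks at several granularities, so the same corpus yields sentence-level as well as progressively more document-level examples. This is a minimal sketch, not the authors' implementation; the separator token, function name, and chunk sizes are illustrative assumptions.

from typing import Iterable, List, Tuple

SEP = " <sep> "  # illustrative sentence separator; the actual system may use a different scheme


def multi_resolution_pairs(
    src_doc: List[str],
    tgt_doc: List[str],
    resolutions: Iterable[int] = (1, 2, 4, 8),
) -> List[Tuple[str, str]]:
    """Slice a sentence-aligned document pair into chunks of several sizes.

    Resolution 1 reproduces ordinary sentence-level pairs; larger resolutions
    concatenate consecutive sentences into document-level training examples.
    """
    assert len(src_doc) == len(tgt_doc), "documents must be sentence-aligned"
    pairs: List[Tuple[str, str]] = []
    for k in resolutions:
        for i in range(0, len(src_doc), k):
            pairs.append((SEP.join(src_doc[i:i + k]), SEP.join(tgt_doc[i:i + k])))
    return pairs


# Example: a 4-sentence document yields 4 sentence-level pairs,
# 2 two-sentence pairs, and 1 four-sentence pair.
if __name__ == "__main__":
    src = ["S1.", "S2.", "S3.", "S4."]
    tgt = ["T1.", "T2.", "T3.", "T4."]
    for s, t in multi_resolution_pairs(src, tgt, resolutions=(1, 2, 4)):
        print(s, "=>", t)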
Anthology ID: 2023.wmt-1.32
Volume: Proceedings of the Eighth Conference on Machine Translation
Month: December
Year: 2023
Address: Singapore
Editors: Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue: WMT
SIG: SIGMT
Publisher: Association for Computational Linguistics
Pages: 302–306
URL: https://aclanthology.org/2023.wmt-1.32
DOI: 10.18653/v1/2023.wmt-1.32
Cite (ACL): Yuhao Xie, Zongyao Li, Zhanglin Wu, Daimeng Wei, Xiaoyu Chen, Zhiqiang Rao, Shaojun Li, Hengchao Shang, Jiaxin Guo, Lizhi Lei, Hao Yang, and Yanfei Jiang. 2023. HW-TSC’s Submissions to the WMT23 Discourse-Level Literary Translation Shared Task. In Proceedings of the Eighth Conference on Machine Translation, pages 302–306, Singapore. Association for Computational Linguistics.
Cite (Informal): HW-TSC’s Submissions to the WMT23 Discourse-Level Literary Translation Shared Task (Xie et al., WMT 2023)
PDF: https://aclanthology.org/2023.wmt-1.32.pdf