
Revisiting the Context Window for Cross-lingual Word Embeddings

Ryokan Ri, Yoshimasa Tsuruoka


Abstract
Existing approaches to mapping-based cross-lingual word embeddings assume that the source and target embedding spaces are structurally similar. The structure of an embedding space largely depends on the co-occurrence statistics of each word, which are in turn determined by the choice of context window. Despite this obvious connection between the context window and mapping-based cross-lingual embeddings, their relationship has been underexplored in prior work. In this work, we provide a thorough evaluation, across various languages, domains, and tasks, of bilingual embeddings trained with different context windows. The highlight of our findings is that increasing both the source and target window sizes improves the performance of bilingual lexicon induction, particularly on frequent nouns.
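The abstract's premise is that the context window shapes a word's co-occurrence statistics, and hence the geometry of the embedding space. The sketch below illustrates that dependence directly: a minimal co-occurrence counter over a symmetric window, not the paper's actual training pipeline (the function name and toy sentence are illustrative assumptions).

```python
from collections import Counter

def cooccurrence_counts(tokens, window):
    """Count directed co-occurrence pairs within +/- `window` tokens.

    Illustrative sketch only: a wider window admits more distant,
    more 'topical' context pairs, which is the statistic the paper
    argues the embedding-space structure depends on.
    """
    counts = Counter()
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                counts[(center, tokens[j])] += 1
    return counts

sent = "the cat sat on the mat".split()
narrow = cooccurrence_counts(sent, window=1)  # only adjacent pairs
wide = cooccurrence_counts(sent, window=3)    # adds longer-range pairs

# ("sat", "mat") are 3 tokens apart: counted with window=3, not window=1.
print(("sat", "mat") in narrow, ("sat", "mat") in wide)
```

Word2vec-style models (as used for the bilingual embeddings studied here) effectively weight such pairs during training, so varying `window` on either the source or the target side changes which statistics each monolingual space encodes.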
Anthology ID:
2020.acl-main.94
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
995–1005
URL:
https://aclanthology.org/2020.acl-main.94
DOI:
10.18653/v1/2020.acl-main.94
Cite (ACL):
Ryokan Ri and Yoshimasa Tsuruoka. 2020. Revisiting the Context Window for Cross-lingual Word Embeddings. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 995–1005, Online. Association for Computational Linguistics.
Cite (Informal):
Revisiting the Context Window for Cross-lingual Word Embeddings (Ri & Tsuruoka, ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.94.pdf
Video:
http://slideslive.com/38928903
Data
MLDoc