Conventional distributed word representation learning, which learns a single vector for each word, cannot represent the different meanings of polysemous words. To address this issue, a number of approaches have been proposed in recent years to model individual word senses. However, most of these sense representations are difficult to integrate into downstream tasks. In this paper, we propose a knowledge-based method to learn word sense representations that can offer effective support in downstream tasks. More specifically, we propose to capture the semantic information of prior human knowledge from sememes, the minimum semantic units of meaning, to build global sense context vectors and perform reliable soft word sense disambiguation for polysemous words. We extend the framework of the Skip-gram model with a contextual attention mechanism to learn an individual embedding for each sense. Intrinsic experimental results show that our proposed method captures the distinct and exact meanings of senses and outperforms previous work on the classic word similarity task. The extrinsic experiment and further analysis show that our sense embeddings can be used to effectively improve performance and mitigate the impact of polysemy in multiple real-world downstream tasks.
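To make the described approach concrete, the following is a minimal sketch (plain NumPy) of how sememe-based global sense context vectors, a contextual attention mechanism for soft word sense disambiguation, and a Skip-gram-style negative-sampling update could fit together. The sememe inventory, vocabulary, dimensions, and function names below are hypothetical placeholders for illustration, not the authors' implementation or data.

```python
# Illustrative sketch only: sense-attentive Skip-gram with sememe-based
# soft word sense disambiguation. SEMEME_INVENTORY, the toy corpus, and
# all hyperparameters are assumptions, not the paper's released resources.
import numpy as np

rng = np.random.default_rng(0)
DIM = 50

# Hypothetical sememe inventory: each sense of a word is a set of sememes.
SEMEME_INVENTORY = {
    "bank": [["money", "institution"], ["river", "land"]],  # two senses
    "river": [["water", "flow"]],
    "loan": [["money", "lend"]],
}
sememes = sorted({s for senses in SEMEME_INVENTORY.values()
                  for sense in senses for s in sense})
sememe_vec = {s: rng.normal(scale=0.1, size=DIM) for s in sememes}

# One embedding per sense (learned) and one context embedding per word.
sense_emb = {w: [rng.normal(scale=0.1, size=DIM) for _ in senses]
             for w, senses in SEMEME_INVENTORY.items()}
ctx_emb = {w: rng.normal(scale=0.1, size=DIM) for w in SEMEME_INVENTORY}

def sense_context_vectors(word):
    # Global sense context vector = average of that sense's sememe embeddings.
    return [np.mean([sememe_vec[s] for s in sense], axis=0)
            for sense in SEMEME_INVENTORY[word]]

def soft_wsd(word, context_words):
    # Contextual attention: softmax over similarities between the local
    # context vector and each global sense context vector.
    c = np.mean([ctx_emb[w] for w in context_words], axis=0)
    scores = np.array([v @ c for v in sense_context_vectors(word)])
    scores -= scores.max()
    att = np.exp(scores)
    return att / att.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_pair(target, context_word, context_window, lr=0.05, k_neg=2):
    # Skip-gram with negative sampling, where the target is represented by
    # the attention-weighted mixture of its sense embeddings.
    att = soft_wsd(target, context_window)
    v = sum(a * e for a, e in zip(att, sense_emb[target]))
    samples = [(context_word, 1.0)] + [
        (w, 0.0) for w in rng.choice(list(ctx_emb), size=k_neg)
    ]
    grad_v = np.zeros(DIM)
    for w, label in samples:
        g = sigmoid(v @ ctx_emb[w]) - label
        grad_v += g * ctx_emb[w]
        ctx_emb[w] -= lr * g * v
    # Each sense embedding receives the gradient scaled by its attention weight.
    for a, e in zip(att, sense_emb[target]):
        e -= lr * a * grad_v

# Toy usage: a money-related context should shift attention toward the
# financial sense of "bank".
train_pair("bank", "loan", ["loan", "river"])
print(soft_wsd("bank", ["loan"]))
```

The key design point the sketch tries to convey is that disambiguation stays soft: every sense embedding is updated in proportion to its attention weight, so no hard sense assignment is made during training.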