Global Relation Embedding for Relation Extraction

Yu Su, Honglei Liu, Semih Yavuz, Izzeddin Gür, Huan Sun, Xifeng Yan


Abstract
We study the problem of textual relation embedding with distant supervision. To combat the wrong labeling problem of distant supervision, we propose to embed textual relations with global statistics of relations, i.e., the co-occurrence statistics of textual and knowledge base relations collected from the entire corpus. This approach turns out to be more robust to the training noise introduced by distant supervision. On a popular relation extraction dataset, we show that the learned textual relation embedding can be used to augment existing relation extraction models and significantly improve their performance. Most remarkably, for the top 1,000 relational facts discovered by the best existing model, the precision can be improved from 83.9% to 89.3%.
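The "global statistics" the abstract refers to are corpus-level co-occurrence counts between textual relations (dependency paths or surface patterns between entity mentions) and knowledge base relations. A minimal sketch of collecting such statistics — not the authors' implementation, with toy relation names chosen purely for illustration — might look like:

```python
from collections import defaultdict

def global_cooccurrence(pairs):
    """Count how often each textual relation co-occurs with each KB relation
    across the whole corpus, then normalize each row into a conditional
    distribution P(kb_relation | textual_relation)."""
    counts = defaultdict(lambda: defaultdict(int))
    for textual_rel, kb_rel in pairs:
        counts[textual_rel][kb_rel] += 1
    dist = {}
    for textual_rel, kb_counts in counts.items():
        total = sum(kb_counts.values())
        dist[textual_rel] = {kb: c / total for kb, c in kb_counts.items()}
    return dist

# Toy distantly supervised corpus: (textual relation, KB relation) pairs.
# The third pair is a wrong label of the kind distant supervision produces;
# aggregated over the corpus, it becomes a minority of the distribution.
pairs = [
    ("<subj> was born in <obj>", "place_of_birth"),
    ("<subj> was born in <obj>", "place_of_birth"),
    ("<subj> was born in <obj>", "place_lived"),
    ("<subj> works at <obj>", "employer"),
]
dist = global_cooccurrence(pairs)
```

Training an embedding against these smoothed corpus-level distributions, rather than against individual (and possibly mislabeled) sentences, is what makes the approach more robust to distant-supervision noise.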
Anthology ID:
N18-1075
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
820–830
URL:
https://aclanthology.org/N18-1075
DOI:
10.18653/v1/N18-1075
Cite (ACL):
Yu Su, Honglei Liu, Semih Yavuz, Izzeddin Gür, Huan Sun, and Xifeng Yan. 2018. Global Relation Embedding for Relation Extraction. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 820–830, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Global Relation Embedding for Relation Extraction (Su et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-1075.pdf
Code
ppuliu/GloRE + additional community code