
Learning from Sibling Mentions with Scalable Graph Inference in Fine-Grained Entity Typing

Yi Chen, Jiayang Cheng, Haiyun Jiang, Lemao Liu, Haisong Zhang, Shuming Shi, Ruifeng Xu


Abstract
In this paper, we first empirically show that existing models struggle to handle hard mentions whose contexts are insufficient, which in turn limits their overall typing performance. To address this, we propose to exploit sibling mentions to enhance mention representations. Specifically, we present two metrics for sibling selection and employ an attentive graph neural network to aggregate information from sibling mentions. The proposed graph model is scalable in that unseen test mentions can be added as new nodes at inference time. Extensive experiments demonstrate the effectiveness of our sibling learning strategy, with our model outperforming ten strong baselines. Moreover, our experiments confirm that sibling mentions help clarify the types of hard mentions.
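As an illustration of the attentive aggregation described in the abstract, below is a minimal sketch (not the authors' released code) of how a mention representation might be enhanced with attention-weighted sibling embeddings; the class name, dimensions, and residual combination are all assumptions made for exposition.

```python
# Illustrative sketch only: attention-weighted aggregation of sibling-mention
# embeddings, assuming each mention is already encoded as a fixed-size vector.
# SiblingAggregator and hidden_dim are hypothetical names, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiblingAggregator(nn.Module):
    """One attentive aggregation step over a mention and its siblings."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, mention: torch.Tensor, siblings: torch.Tensor) -> torch.Tensor:
        # mention: (hidden_dim,); siblings: (num_siblings, hidden_dim)
        q = self.query(mention)                    # (hidden_dim,)
        k = self.key(siblings)                     # (num_siblings, hidden_dim)
        scores = k @ q / k.size(-1) ** 0.5         # scaled dot-product scores
        weights = F.softmax(scores, dim=0)         # attention over siblings
        context = weights @ siblings               # weighted sibling summary
        # A residual combination keeps the original mention signal intact.
        return mention + context

# An unseen test mention can be handled the same way: encode it, select its
# siblings by a similarity metric, and run one forward pass without retraining.
agg = SiblingAggregator(hidden_dim=8)
mention = torch.randn(8)
siblings = torch.randn(5, 8)
print(agg(mention, siblings).shape)  # torch.Size([8])
```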
Anthology ID:
2022.acl-long.147
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2076–2087
URL:
https://aclanthology.org/2022.acl-long.147
DOI:
10.18653/v1/2022.acl-long.147
Cite (ACL):
Yi Chen, Jiayang Cheng, Haiyun Jiang, Lemao Liu, Haisong Zhang, Shuming Shi, and Ruifeng Xu. 2022. Learning from Sibling Mentions with Scalable Graph Inference in Fine-Grained Entity Typing. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2076–2087, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Learning from Sibling Mentions with Scalable Graph Inference in Fine-Grained Entity Typing (Chen et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.147.pdf
Software:
2022.acl-long.147.software.zip