
GraphCache: Message Passing as Caching for Sentence-Level Relation Extraction

Yiwei Wang, Muhao Chen, Wenxuan Zhou, Yujun Cai, Yuxuan Liang, Bryan Hooi


Abstract
Entity types and textual context are essential properties for sentence-level relation extraction (RE). Existing work only encodes these properties within individual instances, which limits the performance of RE given the limited features available in a single sentence. In contrast, we model these properties over the whole dataset and use the dataset-level information to enrich the semantics of every instance. We propose the GraphCache (Graph Neural Network as Caching) module, which propagates features across sentences to learn better representations for RE. GraphCache aggregates the features from sentences in the whole dataset to learn global representations of properties, and uses them to augment the local features within individual sentences. The global property features act as dataset-level prior knowledge for RE and as a complement to the sentence-level features. Inspired by the classical caching technique in computer systems, we develop GraphCache to update the property representations in an online manner. Overall, GraphCache yields significant effectiveness gains on RE and enables efficient message passing across all sentences in the dataset.
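The sketch below illustrates the caching idea described in the abstract: global property representations are maintained in a cache, refreshed online from sentence-level features, and then used to augment each sentence's local features. This is not the authors' implementation; all names (PropertyCache, momentum, the concatenation-based augmentation) are hypothetical assumptions made for illustration.

```python
import torch


class PropertyCache:
    """Keeps one global vector per property (e.g., an entity-type pair) and
    refreshes it with a momentum (moving-average) update, in the spirit of
    the online, cache-style updates described in the abstract."""

    def __init__(self, num_properties: int, dim: int, momentum: float = 0.9):
        self.momentum = momentum
        # One cached global representation per property, initialized to zeros.
        self.cache = torch.zeros(num_properties, dim)

    @torch.no_grad()
    def update(self, property_ids: torch.Tensor, sent_features: torch.Tensor):
        """Online update: blend each cached property vector with the mean
        features of the sentences in the current batch carrying that property."""
        for pid in property_ids.unique():
            batch_mean = sent_features[property_ids == pid].mean(dim=0)
            self.cache[pid] = (
                self.momentum * self.cache[pid]
                + (1.0 - self.momentum) * batch_mean
            )

    def augment(self, property_ids: torch.Tensor, sent_features: torch.Tensor):
        """Augment local sentence features with the cached global property
        features (here, by simple concatenation)."""
        return torch.cat([sent_features, self.cache[property_ids]], dim=-1)


# Usage: during training, update the cache from the current batch, then feed
# the augmented features to the relation classifier.
cache = PropertyCache(num_properties=50, dim=768)
sent_features = torch.randn(4, 768)          # encoder outputs for 4 sentences
property_ids = torch.tensor([3, 3, 7, 12])   # e.g., entity-type-pair ids
cache.update(property_ids, sent_features)
augmented = cache.augment(property_ids, sent_features)  # shape (4, 1536)
```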
Anthology ID:
2022.findings-naacl.128
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1698–1708
URL:
https://aclanthology.org/2022.findings-naacl.128
DOI:
10.18653/v1/2022.findings-naacl.128
Cite (ACL):
Yiwei Wang, Muhao Chen, Wenxuan Zhou, Yujun Cai, Yuxuan Liang, and Bryan Hooi. 2022. GraphCache: Message Passing as Caching for Sentence-Level Relation Extraction. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1698–1708, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
GraphCache: Message Passing as Caching for Sentence-Level Relation Extraction (Wang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.128.pdf
Data
SemEval-2010 Task 8