
Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction

Xinyi Wang, Zitao Wang, Wei Hu


Abstract
Continual few-shot relation extraction (RE) aims to continuously train a model on new relations with only a few labeled training samples, where the major challenges are catastrophic forgetting of old relations and overfitting caused by data sparsity. In this paper, we propose a new model, namely SCKD, to accomplish the continual few-shot RE task. Specifically, we design serial knowledge distillation to preserve the prior knowledge from previous models and conduct contrastive learning with pseudo samples to keep the representations of samples in different relations sufficiently distinguishable. Our experiments on two benchmark datasets validate the effectiveness of SCKD for continual few-shot RE and its superiority in knowledge transfer and memory utilization over state-of-the-art models.
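
The abstract combines two generic ingredients: knowledge distillation from the previous model and a contrastive objective over relation representations (including pseudo samples). The following is a minimal sketch of how such a combined loss is commonly formed in PyTorch; it is not the authors' implementation, and all function names, tensor shapes, and hyperparameters are illustrative assumptions.

# Minimal sketch (not the SCKD code): generic knowledge-distillation and
# supervised-contrastive losses of the kind described in the abstract.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions,
    used to preserve knowledge from the previous model."""
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * (t * t)

def contrastive_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss: pulls together samples (real or pseudo)
    of the same relation and pushes apart samples of different relations."""
    z = F.normalize(embeddings, dim=-1)
    sim = z @ z.t() / temperature                      # pairwise similarities
    mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)).float()
    mask.fill_diagonal_(0)                             # exclude self-pairs
    logits_mask = torch.ones_like(mask).fill_diagonal_(0)
    exp_sim = torch.exp(sim) * logits_mask
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)
    mean_log_prob = (mask * log_prob).sum(dim=1) / mask.sum(dim=1).clamp(min=1)
    return -mean_log_prob.mean()

# Toy usage with random tensors standing in for encoder outputs.
batch, num_relations, dim = 8, 5, 16
student_logits = torch.randn(batch, num_relations)
teacher_logits = torch.randn(batch, num_relations)   # from the frozen previous model
embeddings = torch.randn(batch, dim)
labels = torch.randint(0, num_relations, (batch,))

loss = distillation_loss(student_logits, teacher_logits) + contrastive_loss(embeddings, labels)
print(loss.item())

In practice the two terms would be weighted and added to the standard classification loss for the current task; those weights are design choices not specified here.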
Anthology ID:
2023.findings-acl.804
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12693–12706
URL:
https://aclanthology.org/2023.findings-acl.804
DOI:
10.18653/v1/2023.findings-acl.804
Cite (ACL):
Xinyi Wang, Zitao Wang, and Wei Hu. 2023. Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12693–12706, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Serial Contrastive Knowledge Distillation for Continual Few-shot Relation Extraction (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.804.pdf