
Making Pre-trained Language Models Better Continual Few-Shot Relation Extractors

Shengkun Ma, Jiale Han, Yi Liang, Bo Cheng


Abstract
Continual Few-shot Relation Extraction (CFRE) is a practical problem that requires the model to continuously learn novel relations while avoiding forgetting old ones, given only few labeled training examples. The primary challenges are catastrophic forgetting and overfitting. This paper harnesses prompt learning to explore the implicit capabilities of pre-trained language models to address the above two challenges, thereby making language models better continual few-shot relation extractors. Specifically, we propose a Contrastive Prompt Learning framework, which designs prompt representations to acquire more generalized knowledge that can be easily adapted to old and new categories, and margin-based contrastive learning to focus more on hard samples, thereby alleviating catastrophic forgetting and overfitting issues. To further remedy overfitting in low-resource scenarios, we introduce an effective memory augmentation strategy that employs well-crafted prompts to guide ChatGPT in generating diverse samples. Extensive experiments demonstrate that our method outperforms state-of-the-art methods by a large margin and significantly mitigates catastrophic forgetting and overfitting in low-resource scenarios.
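To make the margin-based contrastive learning idea concrete, below is a minimal sketch of a generic margin-based contrastive loss over relation embeddings. It is an illustrative assumption, not the paper's exact formulation: the function name, the cosine-similarity formulation, and the `margin` value are all hypothetical.

```python
# Minimal sketch of a margin-based contrastive loss over prompt-derived
# relation embeddings (illustrative only; not the authors' exact loss).
import torch
import torch.nn.functional as F

def margin_contrastive_loss(embeddings: torch.Tensor,
                            labels: torch.Tensor,
                            margin: float = 0.5) -> torch.Tensor:
    """Pull same-relation pairs together and push different-relation pairs
    apart until their cosine similarity falls below `margin`."""
    emb = F.normalize(embeddings, dim=-1)              # (N, d) unit vectors
    sim = emb @ emb.t()                                # pairwise cosine similarity
    same = labels.unsqueeze(0) == labels.unsqueeze(1)  # same-relation mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    pos = same & ~eye                                  # positive pairs, excluding self
    neg = ~same                                        # negative pairs
    # Positives: penalize similarity below 1. Negatives: only those whose
    # similarity exceeds the margin contribute, i.e. the "hard" samples.
    pos_loss = (1.0 - sim[pos]).clamp(min=0).mean() if pos.any() else sim.new_zeros(())
    neg_loss = (sim[neg] - margin).clamp(min=0).mean() if neg.any() else sim.new_zeros(())
    return pos_loss + neg_loss
```

Because only negative pairs more similar than the margin incur a penalty, the gradient concentrates on hard samples, which is the behaviour the abstract attributes to the margin-based objective.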
Anthology ID:
2024.lrec-main.957
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
10970–10983
URL:
https://aclanthology.org/2024.lrec-main.957
Cite (ACL):
Shengkun Ma, Jiale Han, Yi Liang, and Bo Cheng. 2024. Making Pre-trained Language Models Better Continual Few-Shot Relation Extractors. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 10970–10983, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Making Pre-trained Language Models Better Continual Few-Shot Relation Extractors (Ma et al., LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.957.pdf