
ALLSH: Active Learning Guided by Local Sensitivity and Hardness

Shujian Zhang, Chengyue Gong, Xingchao Liu, Pengcheng He, Weizhu Chen, Mingyuan Zhou


Abstract
Active learning, which effectively collects informative unlabeled data for annotation, reduces the demand for labeled data. In this work, we propose to retrieve unlabeled samples with a local sensitivity and hardness-aware acquisition function. The proposed method generates data copies through local perturbations and selects the data points whose predictive likelihoods diverge the most from their copies. We further strengthen the acquisition function by injecting a selected worst-case perturbation. Our method achieves consistent gains over commonly used active learning strategies on various classification tasks. Furthermore, we observe consistent improvements over the baselines in a study of prompt selection for prompt-based few-shot learning. These experiments demonstrate that an acquisition guided by local sensitivity and hardness is effective and beneficial for many NLP tasks.
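
The selection rule described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: `predict_probs` and `perturb` are hypothetical stand-ins for the task model's predictive distribution and a local input perturbation (e.g., word substitution or paraphrasing), and KL divergence is used here as one possible choice of divergence between the original and perturbed predictions.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two categorical distributions."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def acquisition_scores(texts, predict_probs, perturb):
    """Score each unlabeled example by how much the model's predictions
    diverge between the original input and a locally perturbed copy."""
    scores = []
    for x in texts:
        p_orig = predict_probs(x)           # class probabilities for the original input
        p_pert = predict_probs(perturb(x))  # probabilities for a perturbed copy
        scores.append(kl_divergence(p_orig, p_pert))
    return scores

def select_for_annotation(texts, predict_probs, perturb, k):
    """Return the k unlabeled examples with the largest divergence scores."""
    scores = acquisition_scores(texts, predict_probs, perturb)
    ranked = sorted(range(len(texts)), key=lambda i: scores[i], reverse=True)
    return [texts[i] for i in ranked[:k]]
```

Under this sketch, examples whose predictions are most sensitive to small local perturbations are prioritized for annotation; the paper additionally considers worst-case (adversarially chosen) perturbations, which would replace the fixed `perturb` function above.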
Anthology ID:
2022.findings-naacl.99
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1328–1342
URL:
https://aclanthology.org/2022.findings-naacl.99
DOI:
10.18653/v1/2022.findings-naacl.99
Cite (ACL):
Shujian Zhang, Chengyue Gong, Xingchao Liu, Pengcheng He, Weizhu Chen, and Mingyuan Zhou. 2022. ALLSH: Active Learning Guided by Local Sensitivity and Hardness. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1328–1342, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
ALLSH: Active Learning Guided by Local Sensitivity and Hardness (Zhang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.99.pdf
Video:
https://aclanthology.org/2022.findings-naacl.99.mp4
Data
AG News, GLUE, IMDb Movie Reviews, QNLI, SST, SST-2