
Rethinking Negative Sampling for Handling Missing Entity Annotations

Yangming Li, Lemao Liu, Shuming Shi


Abstract
Negative sampling is highly effective in handling missing annotations for named entity recognition (NER). One of our contributions is an analysis of why it works, introduced via two insightful concepts: missampling and uncertainty. Empirical studies show that a low missampling rate and high uncertainty are both essential for achieving promising performance with negative sampling. Based on the sparsity of named entities, we also theoretically derive a lower bound for the probability of zero missampling rate, which depends only on sentence length. The other contribution is an adaptive and weighted sampling distribution that further improves negative sampling, guided by our former analysis. Experiments on synthetic datasets and well-annotated datasets (e.g., CoNLL-2003) show that our proposed approach benefits negative sampling in terms of F1 score and loss convergence. Moreover, models with improved negative sampling have achieved new state-of-the-art results on real-world datasets (e.g., EC).
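As a rough illustration of the span-level negative sampling the abstract refers to, a uniform baseline might look like the sketch below. The function name, the sampling ratio, and the uniform distribution over candidate spans are all assumptions for illustration; the paper's contribution is an adaptive, weighted distribution over these candidates, not this uniform draw.

```python
import math
import random

def sample_negatives(n, gold_spans, ratio=0.35, seed=0):
    """Uniformly sample negative spans for a sentence of length n.

    Every span (i, j) with 0 <= i <= j < n that is not a gold entity is a
    candidate negative; we draw ceil(ratio * n) of them without replacement,
    so unlabeled spans are sampled rather than all treated as non-entities.
    (Hypothetical baseline: the ratio and the uniform distribution are
    illustrative assumptions, not the paper's weighted variant.)
    """
    rng = random.Random(seed)
    candidates = [(i, j) for i in range(n) for j in range(i, n)
                  if (i, j) not in gold_spans]
    k = min(math.ceil(ratio * n), len(candidates))
    return rng.sample(candidates, k)

# Sentence of 6 tokens with one gold entity span (1, 2):
negs = sample_negatives(6, {(1, 2)})
```

Under missing annotations, sampling only a small number of negatives lowers the chance of "missampling" an unlabeled true entity as a negative, which is the failure mode the abstract's analysis quantifies.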
Anthology ID:
2022.acl-long.497
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7188–7197
URL:
https://aclanthology.org/2022.acl-long.497
DOI:
10.18653/v1/2022.acl-long.497
Cite (ACL):
Yangming Li, Lemao Liu, and Shuming Shi. 2022. Rethinking Negative Sampling for Handling Missing Entity Annotations. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7188–7197, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Rethinking Negative Sampling for Handling Missing Entity Annotations (Li et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.497.pdf