
A Label-Aware Autoregressive Framework for Cross-Domain NER

Jinpeng Hu, He Zhao, Dan Guo, Xiang Wan, Tsung-Hui Chang


Abstract
Cross-domain named entity recognition (NER) aims to borrow entity information from a source domain to aid entity recognition in a target domain with limited labeled data. Despite the promising performance of existing approaches, most of them focus on reducing the discrepancy of token representations between the source and target domains, while the transfer of valuable label information is often not explicitly considered or is even ignored. Therefore, we propose a novel autoregressive framework to advance cross-domain NER by first enhancing the relationship between labels and tokens and then further improving the transferability of label information. Specifically, we associate each label with an embedding vector, and for each token, we utilize a bidirectional LSTM (Bi-LSTM) to encode the labels of its previous tokens, modeling internal context information and label dependence. Afterward, we propose a Bi-Attention module that merges the token representations from a pre-trained model with the label features from the Bi-LSTM as label-aware information, which is concatenated to the token representation to facilitate cross-domain NER. In doing so, label information contained in the embedding vectors can be effectively transferred to the target domain, and the Bi-LSTM can further model label relationships across domains via a pretrain-then-finetune setting. Experimental results on several datasets confirm the effectiveness of our model, which achieves significant improvements over state-of-the-art methods.
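The abstract's core fusion step can be illustrated with a minimal sketch. The snippet below is not the paper's implementation: it assumes the Bi-LSTM label features are already computed, uses plain dot-product attention in both directions (token-to-label and label-to-token), merges the two attended views by summation (an assumption, since the paper's exact Bi-Attention formulation is not given here), and concatenates the result to the token representations as described in the abstract. All function names and dimensions are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bi_attention_fuse(token_reprs, label_feats):
    """Sketch of a Bi-Attention fusion (illustrative, not the paper's exact module).

    token_reprs: (T, d) token representations from a pre-trained encoder.
    label_feats: (T, d) label features from a Bi-LSTM over previous labels.
    Returns:     (T, 2d) tokens concatenated with label-aware information.
    """
    scores = token_reprs @ label_feats.T                 # (T, T) similarity
    tok_to_lab = softmax(scores, axis=-1) @ label_feats  # attend tokens over labels
    lab_to_tok = softmax(scores.T, axis=-1) @ token_reprs  # attend labels over tokens
    label_aware = tok_to_lab + lab_to_tok                # merge (assumed: sum)
    return np.concatenate([token_reprs, label_aware], axis=-1)
```

In the full model, `label_feats` for token t would be produced autoregressively from the embeddings of the labels predicted for tokens 1..t-1, which is what lets the label embeddings and the Bi-LSTM carry label information across domains.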
Anthology ID:
2022.findings-naacl.171
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2222–2232
URL:
https://aclanthology.org/2022.findings-naacl.171
DOI:
10.18653/v1/2022.findings-naacl.171
Cite (ACL):
Jinpeng Hu, He Zhao, Dan Guo, Xiang Wan, and Tsung-Hui Chang. 2022. A Label-Aware Autoregressive Framework for Cross-Domain NER. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2222–2232, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
A Label-Aware Autoregressive Framework for Cross-Domain NER (Hu et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.171.pdf
Software:
 2022.findings-naacl.171.software.zip
Video:
 https://aclanthology.org/2022.findings-naacl.171.mp4
Code:
 jinpeng01/laner