Abstract
Document-level Relation Extraction (DocRE) aims to identify and extract relations between entities across an entire document. Unlike sentence-level relation extraction, DocRE must take cross-sentence context into account to capture complex relations expressed over multiple sentences or paragraphs. Previous research has shown that applying Logical Rules (LR) can improve the efficiency of relation extraction. However, identifying and extracting these logical rules consumes significant memory, and the rules obtained lack specificity. To address these issues, this paper proposes a novel approach, ELRSD (the implementation code for ELRSD can be obtained from GitHub: https://github.com/maoxuxu/elrsd). Building on a Self-Distillation (SD) training process, ELRSD uses synergistic learning between Multi-head Graph Convolutional Networks and a Transformer to improve the extraction of logical rules, and thereby the extraction of entity relations. We conduct comprehensive experiments on three popular benchmark datasets; under the same experimental settings, our method achieves state-of-the-art performance.
Y. Mao and T. Cui contributed equally.
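The paper's implementation lives at the GitHub link above. Purely as a rough illustration of the two ingredients the abstract names, a multi-head graph convolution layer and a self-distillation objective, the following NumPy sketch shows one plausible form of each. All function names, weight shapes, and the temperature/blending hyperparameters here are assumptions for illustration, not the authors' actual ELRSD code.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_gcn_layer(x, adj, head_weights):
    """One GCN layer (in the spirit of Kipf & Welling) with independent
    'heads': each head applies its own weight matrix to the neighborhood
    aggregation adj @ x, and the head outputs are concatenated.
    x: (n, d_in), adj: (n, n) row-normalized, head_weights: list of (d_in, d_head).
    """
    return np.maximum(0.0, np.concatenate([adj @ x @ w for w in head_weights], axis=-1))

def self_distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, T=2.0):
    """Standard self-distillation objective: hard-label cross-entropy blended
    with a temperature-scaled KL term toward the teacher's soft predictions.
    """
    n = len(labels)
    p_student = softmax(student_logits)
    ce = -np.mean(np.log(p_student[np.arange(n), labels] + 1e-12))
    p_teacher = softmax(teacher_logits / T)
    log_ratio = np.log(p_teacher + 1e-12) - np.log(softmax(student_logits / T) + 1e-12)
    kl = np.mean(np.sum(p_teacher * log_ratio, axis=-1)) * T * T
    return alpha * ce + (1.0 - alpha) * kl

# Toy usage: 5 mention nodes, 8-dim features, 2 heads of width 4, 3 relation classes.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))
adj = np.full((5, 5), 0.2)                      # uniform row-normalized adjacency
heads = [rng.normal(size=(8, 4)) for _ in range(2)]
h = multi_head_gcn_layer(x, adj, heads)          # -> shape (5, 8)
loss = self_distillation_loss(rng.normal(size=(5, 3)),
                              rng.normal(size=(5, 3)),
                              np.array([0, 1, 2, 0, 1]))
```

In a real DocRE pipeline the node features would come from a pretrained encoder and the teacher logits from an earlier snapshot (or deeper branch) of the same model; this sketch only fixes the shapes and the loss algebra.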
Notes
1. Please refer to ELRSD: https://codalab.lisn.upsaclay.fr/competitions/365#results.
© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Mao, Y., Cui, T., Ding, Y. (2025). Enhancing Logical Rules Based on Self-Distillation for Document-Level Relation Extraction. In: Wong, D.F., Wei, Z., Yang, M. (eds) Natural Language Processing and Chinese Computing. NLPCC 2024. Lecture Notes in Computer Science, vol 15359. Springer, Singapore. https://doi.org/10.1007/978-981-97-9431-7_31
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-9430-0
Online ISBN: 978-981-97-9431-7
eBook Packages: Computer Science (R0)