
Enhancing Logical Rules Based on Self-Distillation for Document-Level Relation Extraction

  • Conference paper
  • First Online:
Natural Language Processing and Chinese Computing (NLPCC 2024)

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 15359))

Abstract

Document-level Relation Extraction (DocRE) aims to identify and extract relations between entities across an entire document. Unlike sentence-level relation extraction, DocRE must consider cross-sentence contextual information to capture complex relations expressed over multiple sentences or paragraphs. Previous research has shown that applying Logical Rules (LR) can improve the effectiveness of relation extraction. However, identifying and extracting these logical rules requires significant memory, and the resulting rules lack specificity. To address this issue, we propose a novel approach, ELRSD (the implementation code for ELRSD is available on GitHub: https://github.com/maoxuxu/elrsd). Built on a Self-Distillation (SD) training process, ELRSD uses synergistic learning between Multi-head Graph Convolutional Networks and a Transformer to improve the extraction of logical rules, and thereby the extraction of entity relations. We conduct comprehensive experiments on three popular benchmark datasets. Under the same experimental settings, the results indicate that our method achieves state-of-the-art performance.
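The self-distillation training process mentioned in the abstract can be sketched, in highly simplified form, as a loss that combines hard-label cross-entropy with a temperature-scaled KL term against a teacher branch's soft predictions. This is an illustrative sketch only; the function names, the temperature/weighting convention, and the choice of teacher signal are assumptions, not the paper's actual implementation:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def self_distillation_loss(student_logits, teacher_logits, gold,
                           alpha=0.5, temperature=2.0):
    """Illustrative self-distillation objective (an assumption, not ELRSD's code).

    alpha weights the hard-label cross-entropy against the soft
    distillation term; temperature softens both distributions, and the
    T^2 factor keeps gradient magnitudes comparable across temperatures.
    """
    # Hard-label cross-entropy against the gold relation class.
    probs = softmax(student_logits)
    ce = -math.log(probs[gold])

    # Soft targets from the teacher branch (e.g. the same network's
    # earlier snapshot or an auxiliary head, depending on the SD scheme).
    q = softmax(teacher_logits, temperature)
    p = softmax(student_logits, temperature)
    kl = sum(qi * math.log(qi / pi) for qi, pi in zip(q, p))

    return alpha * ce + (1 - alpha) * (temperature ** 2) * kl
```

When the student and teacher agree exactly, the KL term vanishes and only the supervised cross-entropy remains, which is the usual sanity check for a distillation loss.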

Y. Mao and T. Cui—Contributed equally.


Notes

  1. Please refer to ELRSD: https://codalab.lisn.upsaclay.fr/competitions/365#results.


Author information

Authors and Affiliations

Corresponding author

Correspondence to Yanxu Mao.



Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Mao, Y., Cui, T., Ding, Y. (2025). Enhancing Logical Rules Based on Self-Distillation for Document-Level Relation Extraction. In: Wong, D.F., Wei, Z., Yang, M. (eds) Natural Language Processing and Chinese Computing. NLPCC 2024. Lecture Notes in Computer Science, vol 15359. Springer, Singapore. https://doi.org/10.1007/978-981-97-9431-7_31


  • DOI: https://doi.org/10.1007/978-981-97-9431-7_31

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-97-9430-0

  • Online ISBN: 978-981-97-9431-7

  • eBook Packages: Computer Science, Computer Science (R0)
