Improving distant supervision relation extraction with entity-guided enhancement feature

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Selective attention in distant supervision relation extraction is advantageous for dealing with incorrectly labeled sentences in a bag, but it does not help when many sentence bags consist of only one sentence. To address these deficiencies, we propose an entity-guided enhancement feature neural network for distant supervision relation extraction. We observe that key relation features typically reside in significant words and phrases, which can be captured through entity guidance. We first develop an entity-guided attention that measures the relevance between the entities and semantic units at two levels, word and phrase, to capture reliable relation features, which are used to enhance the entity representations. The two multi-level augmented entity representations are then transformed into a relation representation via a linear layer. Next, a semantic fusion layer fuses multiple semantic representations, namely the sentence representation encoded by a piecewise convolutional neural network, the two multi-level augmented entity representations, and the relation representation, to obtain the final enhanced sentence representation. Finally, guided by the relation representations, we introduce a gate pooling strategy to generate a bag-level representation and address the one-sentence-bag problem that arises with selective attention. Extensive experiments demonstrate that our method outperforms state-of-the-art methods.
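To make the pipeline in the abstract concrete, the following PyTorch sketch wires together an entity-guided attention that augments the entity representations, a linear mapping to a relation representation, a fusion layer, and a relation-guided gate pooling over a bag. It is a minimal illustration under stated assumptions, not the authors' released implementation: all module names, tensor shapes, and hyperparameters are hypothetical, and only a single level of semantic units is shown for brevity.

```python
# Hypothetical sketch of entity-guided attention + gated bag pooling.
# Shapes, module names, and hyperparameters are illustrative assumptions,
# not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EntityGuidedBag(nn.Module):
    def __init__(self, hidden_dim: int, num_relations: int):
        super().__init__()
        # Entity-guided attention: score each semantic unit against an entity.
        self.attn_proj = nn.Linear(hidden_dim, hidden_dim)
        # Map the two augmented entity representations to a relation representation.
        self.rel_proj = nn.Linear(2 * hidden_dim, hidden_dim)
        # Fuse sentence, entity, and relation representations.
        self.fusion = nn.Linear(4 * hidden_dim, hidden_dim)
        # Gate producing a per-sentence weight, guided by the relation representation.
        self.gate = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(hidden_dim, num_relations)

    def enhance_entity(self, units, entity):
        # units: (n_sent, seq_len, h) word/phrase states; entity: (n_sent, h)
        scores = torch.einsum("bsh,bh->bs", self.attn_proj(units), entity)
        alpha = F.softmax(scores, dim=-1)
        context = torch.einsum("bs,bsh->bh", alpha, units)
        return entity + context  # augmented entity representation

    def forward(self, sent_repr, word_units, head_ent, tail_ent):
        # sent_repr: (n_sent, h), e.g. from a PCNN encoder
        head = self.enhance_entity(word_units, head_ent)
        tail = self.enhance_entity(word_units, tail_ent)
        rel = torch.tanh(self.rel_proj(torch.cat([head, tail], dim=-1)))
        fused = torch.tanh(self.fusion(
            torch.cat([sent_repr, head, tail, rel], dim=-1)))
        # Gate pooling works for bags of any size, including one sentence.
        g = torch.sigmoid(self.gate(torch.cat([fused, rel], dim=-1)))  # (n_sent, 1)
        bag_repr = (g * fused).sum(dim=0) / g.sum(dim=0).clamp(min=1e-6)
        return self.classifier(bag_repr)


# Toy usage: one bag of 3 sentences, hidden size 64, 53 relation classes.
if __name__ == "__main__":
    model = EntityGuidedBag(hidden_dim=64, num_relations=53)
    logits = model(torch.randn(3, 64), torch.randn(3, 20, 64),
                   torch.randn(3, 64), torch.randn(3, 64))
    print(logits.shape)  # torch.Size([53])
```

The gated sum replaces selective attention's softmax over sentences, so a single-sentence bag still receives a meaningful weight rather than a degenerate attention distribution.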


Data availability

Data are available on request from the authors.

Notes

  1. http://iesl.cs.umass.edu/riedel/ecml/.


Acknowledgements

This work was supported in part by the National Natural Science Foundation of China under Contract 62062012 and Contract 61967003, the Natural Science Foundation of Guangxi of China under Contract 2020GXNSFAA159082 and the Innovation Project of School of Computer Science and Information Engineering, Guangxi Normal University under Contract JXXYYJSCXXM-005.

Author information

Corresponding author

Correspondence to Xinhua Zhu.

Ethics declarations

Conflict of interest

The authors declare that there is no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Wen, H., Zhu, X. & Zhang, L. Improving distant supervision relation extraction with entity-guided enhancement feature. Neural Comput & Applic 35, 7547–7560 (2023). https://doi.org/10.1007/s00521-022-08051-1

