DOI: 10.1145/3384544.3384582 · ICSCA Conference Proceedings · Research article

BGSGA: Combining Bi-GRU and Syntactic Graph Attention for Improving Distant Supervision Relation Extraction

Published: 17 April 2020

Abstract

Distant supervision relation extraction (RE) aligns entities in a knowledge base (KB) with text to automatically construct a large labeled corpus, alleviating the need for the manual annotation required by traditional RE. However, most existing models cannot take full advantage of the syntactic structure information of each word in the dependency tree. In this paper, we propose BGSGA, a novel distant supervision RE model that captures both the semantic information and the syntactic structure within a bag of sentences. BGSGA constructs a syntactic graph by combining the dependency trees of the sentences in the bag, and then employs a syntactic structure attention mechanism to update the word embeddings obtained from a Bi-GRU. This attention mechanism captures cross-sentence information with different weights by attending from each target word to its first- and second-order neighbors in the graph. Experiments show that BGSGA outperforms our baseline models on benchmark datasets.
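The attention step described above can be illustrated with a toy sketch, not the authors' implementation: each word attends over its first- and second-order neighbors in the (combined) dependency graph and is updated as the attention-weighted sum of their embeddings. The function names, the dot-product scoring, and the fixed down-weighting of second-order neighbors are all assumptions made for illustration; the paper's actual mechanism presumably uses learned parameters.

```python
import math

def second_order_neighbors(adj, i):
    """First- and second-order neighbors of node i in an adjacency list."""
    first = set(adj[i])
    second = set()
    for j in first:
        second.update(adj[j])
    second -= first | {i}
    return first, second

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def syntactic_attention(h, adj, second_order_weight=0.5):
    """Toy version of the syntactic-graph attention step: `h` is a list
    of word-embedding vectors (e.g. Bi-GRU outputs), `adj` is the
    dependency graph as an adjacency list. Each word's new embedding is
    the attention-weighted sum over its first- and second-order
    neighbors, with second-order scores down-weighted by a fixed factor
    (a stand-in for the different weights mentioned in the abstract)."""
    updated = []
    for i in range(len(h)):
        first, second = second_order_neighbors(adj, i)
        nbrs = ([(j, 1.0) for j in sorted(first)] +
                [(j, second_order_weight) for j in sorted(second)])
        if not nbrs:  # isolated node: keep its embedding unchanged
            updated.append(list(h[i]))
            continue
        scores = [w * dot(h[i], h[j]) for j, w in nbrs]
        alphas = softmax(scores)  # attention weights sum to 1
        new = [0.0] * len(h[i])
        for (j, _), a in zip(nbrs, alphas):
            for d in range(len(new)):
                new[d] += a * h[j][d]
        updated.append(new)
    return updated
```

For a three-word chain 0–1–2, word 0 has first-order neighbor {1} and second-order neighbor {2}, so its updated embedding is a convex combination of the embeddings of words 1 and 2.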


Published In

ICSCA '20: Proceedings of the 2020 9th International Conference on Software and Computer Applications
February 2020, 382 pages
ISBN: 9781450376655
DOI: 10.1145/3384544

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. Distant supervision relation extraction
    2. Graph neural network
    3. Information extraction
    4. Relation extraction

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    ICSCA 2020

