

GATES: Using Graph Attention Networks for Entity Summarization

Published: 02 December 2021

Abstract

The sheer size of modern knowledge graphs has led to increased attention being paid to the entity summarization task. Given a knowledge graph T and an entity e found therein, solutions to entity summarization select a subset of the triples from T which summarizes e's concise bounded description. Presently, the best-performing approaches rely on sequence-to-sequence models to generate entity summaries and make little to no use of the structural information of T during the summarization process. We hypothesize that this structural information can be exploited to compute better summaries. To verify our hypothesis, we propose GATES, a new entity summarization approach that combines topological information and knowledge graph embeddings to encode triples. The topological information is encoded by means of a Graph Attention Network. Furthermore, ensemble learning is applied to boost the performance of triple scoring. We evaluate GATES on the DBpedia and LMDB datasets from ESBM (version 1.2), as well as on the FACES dataset. Our results show that GATES outperforms the state-of-the-art approaches on 4 of 6 configuration settings and reaches an F-measure of up to 0.574. With respect to the quality of the resulting summaries, GATES still underperforms the state of the art, as it obtains the highest score on only 1 of 6 configuration settings, with an NDCG score of 0.697. An open-source implementation of our approach, along with the code necessary to rerun our experiments, is available at https://github.com/dice-group/GATES.
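The abstract describes a pipeline in which candidate triples are encoded with pretrained knowledge graph and text embeddings, the entity's local graph is passed through a Graph Attention Network to capture topology, and each triple is then scored (with ensemble learning boosting the scoring step). The sketch below is an illustrative, simplified reading of such a pipeline in PyTorch with PyTorch Geometric; the class and function names (TripleScorer, summarize), the dimensions, and the single-model setup are assumptions for illustration only and are not the authors' implementation (see the linked repository for the actual code).

```python
# Illustrative sketch only (not the authors' implementation): a minimal
# GAT-based triple scorer in the spirit of the abstract. Names, dimensions,
# and the single-model setup are assumptions made for this example.
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv


class TripleScorer(nn.Module):
    """Scores the triples in an entity's neighbourhood graph."""

    def __init__(self, node_dim, edge_dim, hidden_dim=64, heads=4):
        super().__init__()
        # Two stacked graph attention layers encode the local topology.
        self.gat1 = GATConv(node_dim, hidden_dim, heads=heads, concat=True)
        self.gat2 = GATConv(hidden_dim * heads, hidden_dim, heads=1, concat=False)
        # A triple is scored from its subject and object node encodings plus
        # its (pretrained) predicate embedding.
        self.scorer = nn.Linear(2 * hidden_dim + edge_dim, 1)

    def forward(self, x, edge_index, edge_attr):
        # x:          [num_nodes, node_dim]    pretrained entity embeddings
        # edge_index: [2, num_triples]         subject/object index per triple
        # edge_attr:  [num_triples, edge_dim]  pretrained predicate embeddings
        h = torch.relu(self.gat1(x, edge_index))
        h = torch.relu(self.gat2(h, edge_index))
        subj, obj = edge_index
        triple_repr = torch.cat([h[subj], h[obj], edge_attr], dim=-1)
        return self.scorer(triple_repr).squeeze(-1)  # one score per triple


def summarize(model, x, edge_index, edge_attr, k=5):
    """Keep the indices of the k highest-scoring triples as the summary."""
    scores = model(x, edge_index, edge_attr)
    return torch.topk(scores, k=min(k, scores.numel())).indices
```

The ensemble step mentioned in the abstract could correspond to combining the scores of several such models trained with different embeddings; the sketch keeps a single scorer for brevity.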

Published In

K-CAP '21: Proceedings of the 11th Knowledge Capture Conference
December 2021
300 pages
ISBN:9781450384575
DOI:10.1145/3460210
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. entity summarization
  2. graph attention network
  3. knowledge graph embeddings
  4. text embeddings

Qualifiers

  • Research-article

Conference

K-CAP '21: Knowledge Capture Conference
December 2 - 3, 2021
Virtual Event, USA

Acceptance Rates

Overall Acceptance Rate 55 of 198 submissions, 28%
