Abstract
Knowledge graphs are widely used in numerous AI applications. In this paper, we propose an efficient knowledge graph embedding model, RotatSAGE, which combines the RotatE and GraphSAGE models. In the proposed model, RotatE is used to learn embedding vectors for the heterogeneous entities and relations in a knowledge graph. A limitation of RotatE is that it can only learn from a single triplet at a time and cannot exploit local neighborhood information when learning embeddings. To address this issue, we introduce GraphSAGE into RotatE: GraphSAGE improves an entity's embedding by sampling a small, fixed number of neighbors and aggregating their information. We also propose a sampling strategy that further removes redundant entity information and simplifies the model. In the experiments, the link prediction task is used to evaluate embedding performance, and results on four benchmark datasets show that the overall performance of RotatSAGE is higher than that of the baseline models.
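The paper's equations are not reproduced on this page, so the following is only a rough illustration of the two ingredients the abstract names: a RotatE-style scoring function (entities as complex vectors, relations as unit-modulus rotations) and a GraphSAGE-style update that samples a small, fixed number of neighbors and mixes their mean with the entity's own embedding. Function names such as `rotate_score` and `sage_mean_aggregate`, the L1 distance, and the 0.5/0.5 combine weights are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rotate_score(h, r_phase, t):
    """RotatE-style triplet score: rotate the head embedding h by the
    relation's phases in the complex plane and measure how close the
    result is to the tail t (higher score = more plausible triplet)."""
    rotation = np.exp(1j * r_phase)               # unit-modulus complex rotation
    return -np.linalg.norm(h * rotation - t, ord=1)

def sage_mean_aggregate(self_emb, neighbor_embs, num_samples=5, rng=None):
    """GraphSAGE-style update: sample a small, fixed number of neighbors,
    average their embeddings, and mix the result with the entity's own
    embedding (weights here are illustrative)."""
    rng = rng or np.random.default_rng(0)
    k = min(num_samples, len(neighbor_embs))
    idx = rng.choice(len(neighbor_embs), size=k, replace=False)
    agg = np.mean(neighbor_embs[idx], axis=0)
    return 0.5 * self_emb + 0.5 * agg

# Toy example with 2-dimensional complex embeddings.
dim = 2
h = np.random.randn(dim) + 1j * np.random.randn(dim)
t = np.random.randn(dim) + 1j * np.random.randn(dim)
r_phase = np.random.uniform(-np.pi, np.pi, size=dim)
neighbors = np.random.randn(4, dim) + 1j * np.random.randn(4, dim)

h_enriched = sage_mean_aggregate(h, neighbors, num_samples=2)
print(rotate_score(h_enriched, r_phase, t))
```

The point of the combination, as described in the abstract, is that the neighbor-aggregation step injects local graph structure into the entity representation before the single-triplet rotational score is computed.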
Acknowledgement
This work was supported by the National Natural Science Foundation of China (Grant No. 61872107).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Ma, Y., Ding, Y., Wang, G. (2022). RotatSAGE: A Scalable Knowledge Graph Embedding Model Based on Translation Assumptions and Graph Neural Networks. In: Tan, Y., Shi, Y. (eds) Data Mining and Big Data. DMBD 2022. Communications in Computer and Information Science, vol 1744. Springer, Singapore. https://doi.org/10.1007/978-981-19-9297-1_8
Publisher Name: Springer, Singapore
Print ISBN: 978-981-19-9296-4
Online ISBN: 978-981-19-9297-1