
TransI: Translating Infinite Dimensional Embeddings Based on Trend Smooth Distance

  • Conference paper
Knowledge Science, Engineering and Management (KSEM 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11775)

Abstract

Knowledge representation learning aims to transform the entities and relationships in a knowledge base into computable forms so that efficient calculation can be realized; it is of great significance to the construction, reasoning, and application of knowledge bases. Traditional translation-based models obtain finite-dimensional vector representations of entities and relationships by projecting them into a finite-dimensional Euclidean space. These simple and effective methods greatly improve the efficiency and accuracy of knowledge representation, but they ignore the fact that the semantic space keeps developing and growing over time: because a knowledge base expands without bound as time passes, the vector representation of entities and relationships should support unbounded capacity, which a finite-dimensional Euclidean space cannot provide. We fill this gap with the TransI (Translating Infinite Dimensional Embeddings) model, which extends knowledge representation learning from finite dimensions to infinite dimensions. TransI is trained with a Trend Smooth Distance based on the idea of a continuous, infinite-dimensional vector representation. Under the same settings, the training efficiency of TransI is clearly better than that of TransE, and its dimension-reduction clustering effect is more pronounced.
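
For context, the translation-based baseline that the abstract compares TransI against is TransE (reference 2), which embeds a triple (h, r, t) so that h + r ≈ t and trains with a margin ranking loss over corrupted triples. The sketch below illustrates only that finite-dimensional baseline; TransI's Trend Smooth Distance and its infinite-dimensional representation are not specified on this page, and the embedding dimension, margin value, and function names are illustrative assumptions rather than values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50  # finite embedding dimension (illustrative choice, not from the paper)

def init_embeddings(n_entities, n_relations):
    # Uniform initialisation in [-6/sqrt(dim), 6/sqrt(dim)], as in TransE.
    bound = 6.0 / np.sqrt(dim)
    ent = rng.uniform(-bound, bound, size=(n_entities, dim))
    rel = rng.uniform(-bound, bound, size=(n_relations, dim))
    return ent, rel

def score(ent, rel, h, r, t):
    # Translation score ||h + r - t||_2: lower means a more plausible triple.
    return np.linalg.norm(ent[h] + rel[r] - ent[t])

def margin_loss(ent, rel, pos, neg, gamma=1.0):
    # Margin ranking loss between a true triple and a corrupted one.
    return max(0.0, gamma + score(ent, rel, *pos) - score(ent, rel, *neg))

# Toy usage: 4 entities, 2 relations; corrupt the tail of the true triple (0, 1, 2).
ent, rel = init_embeddings(4, 2)
print(margin_loss(ent, rel, pos=(0, 1, 2), neg=(0, 1, 3)))
```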


References

  1. Bollacker, K., Evans, C., Paritosh, P., Sturge, T., Taylor, J.: Freebase: a collaboratively created graph database for structuring human knowledge. In: Proceedings of the 2008 ACM SIGMOD International Conference on Management of Data, pp. 1247–1250. ACM (2008)

  2. Bordes, A., Usunier, N., Garcia-Duran, A., Weston, J., Yakhnenko, O.: Translating embeddings for modeling multi-relational data. In: Advances in Neural Information Processing Systems, pp. 2787–2795 (2013)

  3. Suchanek, F.M., Kasneci, G., Weikum, G.: Yago: a core of semantic knowledge unifying WordNet and Wikipedia. In: 16th International World Wide Web Conference, WWW, pp. 697–706 (2007)

  4. Fan, M., Zhou, Q., Chang, E., Zheng, T.F.: Transition-based knowledge graph embedding with relational mapping properties. In: Proceedings of the 28th Pacific Asia Conference on Language, Information and Computing (2014)

  5. Guo, S., Wang, Q., Wang, B., Wang, L., Guo, L.: Semantically smooth knowledge graph embedding. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), vol. 1, pp. 84–94 (2015)

  6. He, S., Liu, K., Ji, G., Zhao, J.: Learning to represent knowledge graphs with Gaussian embedding. In: Proceedings of the 24th ACM International on Conference on Information and Knowledge Management, pp. 623–632. ACM (2015)

  7. Ji, G., He, S., Xu, L., Liu, K., Zhao, J.: Knowledge graph embedding via dynamic mapping matrix. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), vol. 1, pp. 687–696 (2015)

  8. Ji, G., Liu, K., He, S., Zhao, J.: Knowledge graph completion with adaptive sparse transfer matrix. In: Thirtieth AAAI Conference on Artificial Intelligence (2016)

  9. Lin, Y., Liu, Z., Sun, M., Liu, Y., Zhu, X.: Learning entity and relation embeddings for knowledge graph completion. In: Twenty-Ninth AAAI Conference on Artificial Intelligence (2015)

  10. Ma, S., Ding, J., Jia, W., Wang, K., Guo, M.: TransT: type-based multiple embedding representations for knowledge graph completion. In: Ceci, M., Hollmén, J., Todorovski, L., Vens, C., Džeroski, S. (eds.) ECML PKDD 2017. LNCS (LNAI), vol. 10534, pp. 717–733. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-71249-9_43

  11. Mikolov, T., Chen, K., Corrado, G., Dean, J.: Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781 (2013)

  12. Mikolov, T., Sutskever, I., Chen, K., Corrado, G.S., Dean, J.: Distributed representations of words and phrases and their compositionality. In: Advances in Neural Information Processing Systems, pp. 3111–3119 (2013)

  13. Miller, G.: WordNet: a lexical database for English. Commun. ACM 38, 39–41 (1995)

  14. Wang, Q., Wang, B., Guo, L.: Knowledge base completion using embeddings and rules. In: Twenty-Fourth International Joint Conference on Artificial Intelligence (2015)

  15. Wang, Z., Zhang, J., Feng, J., Chen, Z.: Knowledge graph embedding by translating on hyperplanes. In: Twenty-Eighth AAAI Conference on Artificial Intelligence (2014)

  16. Xiao, H., Huang, M., Zhu, X.: TransG: a generative model for knowledge graph embedding. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), vol. 1, pp. 2316–2325 (2016)


Author information

Corresponding author

Correspondence to Xin Wang.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Guo, X., Gao, N., Wang, L., Wang, X. (2019). TransI: Translating Infinite Dimensional Embeddings Based on Trend Smooth Distance. In: Douligeris, C., Karagiannis, D., Apostolou, D. (eds) Knowledge Science, Engineering and Management. KSEM 2019. Lecture Notes in Computer Science (LNAI), vol 11775. Springer, Cham. https://doi.org/10.1007/978-3-030-29551-6_46

  • DOI: https://doi.org/10.1007/978-3-030-29551-6_46

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-29550-9

  • Online ISBN: 978-3-030-29551-6

  • eBook Packages: Computer Science, Computer Science (R0)
