
RotateCT: Knowledge Graph Embedding by Rotation and Coordinate Transformation in Complex Space

Yao Dong, Lei Wang, Ji Xiang, Xiaobo Guo, Yuqiang Xie


Abstract
Knowledge graph embedding, which aims to learn representations of entities and relations in knowledge graphs, finds applications in various downstream tasks. The key to the success of knowledge graph embedding models is the ability to model relation patterns, including symmetry/antisymmetry, inversion, commutative composition, and non-commutative composition. Most existing methods fail to model the non-commutative composition pattern, and the few approaches that support it do so by moving beyond Euclidean and complex spaces. However, expanding to more complicated spaces such as the quaternion space substantially increases the number of parameters, which greatly reduces computational efficiency. In this paper, we propose a new knowledge graph embedding method called RotateCT, which first transforms the coordinates of each entity and then represents each relation as a rotation from head entity to tail entity in complex space. By design, RotateCT can infer the non-commutative composition pattern while improving computational efficiency. Experiments on multiple datasets empirically show that RotateCT outperforms most state-of-the-art methods on link prediction and path query answering.
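The idea described in the abstract (a relation-specific coordinate transformation followed by a rotation in complex space) can be sketched roughly as below. This is a hedged illustration, not the paper's exact formulation: the coordinate transformation is assumed here to be a learned complex offset c_r applied before a RotatE-style unit-modulus rotation, and the score is assumed to be a negative L1 distance. The names rotatect_score, theta_r, and c_r are illustrative only. The toy check at the end shows why such translate-then-rotate maps generally compose non-commutatively, the property the abstract emphasizes.

```python
import numpy as np

def rotatect_score(h, t, theta_r, c_r):
    """Hedged sketch of a RotateCT-style score in complex space.

    Assumptions (not taken from the paper's exact equations):
      - h, t: complex entity embeddings, shape (k,)
      - theta_r: relation phases, shape (k,); the rotation exp(i * theta_r)
        has unit modulus, as in RotatE
      - c_r: relation-specific complex offset standing in for the
        coordinate transformation applied before the rotation
    Returns the negative L1 distance between the shifted, rotated head
    and the shifted tail; higher means a more plausible triple.
    """
    r = np.exp(1j * theta_r)        # unit-modulus rotation
    head = (h - c_r) * r            # transform coordinates, then rotate
    tail = t - c_r                  # move the tail into the same frame
    return -np.linalg.norm(head - tail, ord=1)

# Toy check: composing two translate-then-rotate maps in different orders
# gives different results when their offsets differ, i.e. the composition
# is non-commutative.
k = 4
rng = np.random.default_rng(0)
h = rng.normal(size=k) + 1j * rng.normal(size=k)
theta1, theta2 = rng.uniform(-np.pi, np.pi, size=(2, k))
c1 = rng.normal(size=k) + 1j * rng.normal(size=k)
c2 = rng.normal(size=k) + 1j * rng.normal(size=k)

def apply_relation(x, theta, c):
    # shift, rotate, shift back: an affine map on complex coordinates
    return (x - c) * np.exp(1j * theta) + c

r1_then_r2 = apply_relation(apply_relation(h, theta1, c1), theta2, c2)
r2_then_r1 = apply_relation(apply_relation(h, theta2, c2), theta1, c1)
print(np.allclose(r1_then_r2, r2_then_r1))  # generally False: order matters
```

In contrast, pure rotations in complex space (element-wise multiplication by unit-modulus numbers, as in RotatE) always commute, which is why adding a coordinate transformation is what enables the non-commutative composition pattern in this sketch.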
Anthology ID:
2022.coling-1.436
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
4918–4932
URL:
https://aclanthology.org/2022.coling-1.436
Cite (ACL):
Yao Dong, Lei Wang, Ji Xiang, Xiaobo Guo, and Yuqiang Xie. 2022. RotateCT: Knowledge Graph Embedding by Rotation and Coordinate Transformation in Complex Space. In Proceedings of the 29th International Conference on Computational Linguistics, pages 4918–4932, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
RotateCT: Knowledge Graph Embedding by Rotation and Coordinate Transformation in Complex Space (Dong et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.436.pdf
Data
FB15k-237