Modeling Tree-like Heterophily on Symmetric Matrix Manifolds
Figure 1. Illustration of homophily (left) and heterophily (right) derived from tree-like graphs. Colors denote node categories.
Figure 2. Illustration of graphs embedded in a continuous symmetric space with both flat and tree-like substructures.
Figure 3. Schematic of RGCN.
Figure 4. Classification results under different dimension settings.
Figure 5. Classification results under different propagation layer settings.
Abstract
1. Introduction
- We introduce a graph convolutional network framework that operates on the Riemannian manifold of symmetric positive-definite (SPD) matrices, enabling graph embeddings with lower distortion and greater expressiveness.
- We develop a complete information-propagation scheme on the SPD manifold, using pullback techniques to generalize across several Riemannian metrics.
- Extensive experiments show that the proposed RGCN significantly outperforms existing Euclidean and hyperbolic baselines on semi-supervised node classification.
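The Log-Euclidean metric used throughout the SPD literature cited below flattens the manifold through the matrix logarithm, so manifold averages reduce to ordinary Euclidean averages in the log-domain. The following is a minimal illustrative sketch of that idea (not the paper's implementation):

```python
import numpy as np

def spd_log(P):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(P)
    return V @ np.diag(np.log(w)) @ V.T

def spd_exp(S):
    """Matrix exponential of a symmetric matrix; maps back onto the SPD manifold."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.exp(w)) @ V.T

def log_euclidean_mean(mats):
    """Log-Euclidean mean: average in the flat log-domain, then exponentiate back."""
    return spd_exp(np.mean([spd_log(P) for P in mats], axis=0))
```

For example, the Log-Euclidean mean of 2I and 8I is 4I (the geometric mean), whereas the arithmetic mean would give 5I; this is the sense in which the log map "pulls back" manifold operations to a flat space.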
2. Related Work
2.1. Graph Neural Networks
2.2. Riemannian Manifold of Symmetric Positive-Definite Matrices
3. Preliminaries and Problem Definition
3.1. Riemannian Manifold
3.2. Geometry of SPD Manifold
3.3. Problem Definition
4. SPD Graph Convolutional Networks
4.1. Mapping from Euclidean to SPD Spaces
4.2. Feature Transformation
4.3. Neighborhood Aggregation
4.4. RGCN Architecture
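Sections 4.1–4.3 outline the pipeline: lift Euclidean features onto the SPD manifold, transform them, and aggregate over neighbors. The paper's exact operators are not reproduced here; the sketch below is a hypothetical Log-Euclidean variant (`to_spd` and `aggregate` are illustrative names, not the authors' API):

```python
import numpy as np

def logm(P):
    """Matrix log of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(P)
    return V @ np.diag(np.log(w)) @ V.T

def expm(S):
    """Matrix exp of a symmetric matrix (result is SPD)."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.exp(w)) @ V.T

def to_spd(x, n):
    """Hypothetical Euclidean-to-SPD mapping: fill the upper triangle of a
    symmetric n-by-n matrix with the feature vector, then exponentiate."""
    S = np.zeros((n, n))
    iu = np.triu_indices(n)
    S[iu] = x[: len(iu[0])]
    S = (S + S.T) / 2
    return expm(S)

def aggregate(spd_feats, adj_row):
    """Neighborhood aggregation in the Log-Euclidean tangent space:
    a weighted average of matrix logs, mapped back with exp."""
    w = adj_row / adj_row.sum()
    T = sum(wi * logm(P) for wi, P in zip(w, spd_feats))
    return expm(T)
```

The zero feature vector maps to the identity matrix, and aggregating 2I and 8I with equal weights yields their Log-Euclidean (geometric) mean 4I, so aggregation never leaves the SPD manifold.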
5. Experiments
5.1. Experimental Setup
5.2. Experimental Results
5.3. Analysis and Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Adcock, A.B.; Sullivan, B.D.; Mahoney, M.W. Tree-like structure in large social and information networks. In Proceedings of the 2013 IEEE 13th International Conference on Data Mining, Dallas, TX, USA, 7–10 December 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 1–10.
- Khrennikov, A.; Oleschko, K. An ultrametric random walk model for disease spread taking into account social clustering of the population. Entropy 2020, 22, 931.
- Hu, X.; Chen, H.; Chen, H.; Li, X.; Zhang, J.; Liu, S. Mining Mobile Network Fraudsters with Augmented Graph Neural Networks. Entropy 2023, 25, 150.
- Zhang, X.; Zhou, Y.; Wang, J.; Lu, X. Personal interest attention graph neural networks for session-based recommendation. Entropy 2021, 23, 1500.
- Khrennikov, A.; Oleschko, K.; Correa Lopez, M.d.J. Modeling fluid’s dynamics with master equations in ultrametric spaces representing the treelike structure of capillary networks. Entropy 2016, 18, 249.
- Abu-Ata, M.; Dragan, F.F. Metric tree-like structures in real-world networks: An empirical study. Networks 2016, 67, 49–68.
- Xi, Y.; Cui, X. Identifying Influential Nodes in Complex Networks Based on Information Entropy and Relationship Strength. Entropy 2023, 25, 754.
- Pennec, X.; Fillard, P.; Ayache, N. A Riemannian framework for tensor computing. Int. J. Comput. Vis. 2006, 66, 41–66.
- Arsigny, V.; Fillard, P.; Pennec, X.; Ayache, N. Fast and Simple Computations on Tensors with Log-Euclidean Metrics; Research Report RR-5584; INRIA: Sophia Antipolis, France, 2005.
- Arsigny, V.; Fillard, P.; Pennec, X.; Ayache, N. Geometric means in a novel vector space structure on symmetric positive-definite matrices. SIAM J. Matrix Anal. Appl. 2007, 29, 328–347.
- Lin, Z. Riemannian geometry of symmetric positive definite matrices via Cholesky decomposition. SIAM J. Matrix Anal. Appl. 2019, 40, 1353–1370.
- Kipf, T.N.; Welling, M. Semi-supervised classification with graph convolutional networks. In Proceedings of the International Conference on Learning Representations, Toulon, France, 24–26 April 2017.
- Gilmer, J.; Schoenholz, S.S.; Riley, P.F.; Vinyals, O.; Dahl, G.E. Neural message passing for quantum chemistry. In Proceedings of the International Conference on Machine Learning, PMLR, Sydney, Australia, 6–11 August 2017; pp. 1263–1272.
- Zhang, M.; Chen, Y. Link prediction based on graph neural networks. In Proceedings of the Advances in Neural Information Processing Systems, Montreal, QC, Canada, 3–8 December 2018; Volume 31.
- Xu, K.; Hu, W.; Leskovec, J.; Jegelka, S. How Powerful are Graph Neural Networks? In Proceedings of the International Conference on Learning Representations, New Orleans, LA, USA, 6–9 May 2019.
- Bruna, J.; Zaremba, W.; Szlam, A.; Lecun, Y. Spectral networks and locally connected networks on graphs. In Proceedings of the International Conference on Learning Representations (ICLR2014), CBLS, Banff, AB, Canada, 14–16 April 2014.
- Defferrard, M.; Bresson, X.; Vandergheynst, P. Convolutional neural networks on graphs with fast localized spectral filtering. In Proceedings of the Advances in Neural Information Processing Systems, Barcelona, Spain, 5–10 December 2016; Volume 29.
- Pei, H.; Wei, B.; Chang, K.C.C.; Lei, Y.; Yang, B. Geom-GCN: Geometric Graph Convolutional Networks. In Proceedings of the International Conference on Learning Representations, Addis Ababa, Ethiopia, 26–30 April 2020.
- Hammond, D.K.; Vandergheynst, P.; Gribonval, R. Wavelets on graphs via spectral graph theory. Appl. Comput. Harmon. Anal. 2011, 30, 129–150.
- Zhang, M.; Cui, Z.; Neumann, M.; Chen, Y. An end-to-end deep learning architecture for graph classification. In Proceedings of the AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018; Volume 32.
- Veličković, P.; Cucurull, G.; Casanova, A.; Romero, A.; Lio, P.; Bengio, Y. Graph attention networks. In Proceedings of the International Conference on Learning Representations, Vancouver, BC, Canada, 30 April–3 May 2018.
- Zhang, Z.; Cui, P.; Zhu, W. Deep learning on graphs: A survey. IEEE Trans. Knowl. Data Eng. 2020, 34, 249–270.
- Peng, W.; Varanka, T.; Mostafa, A.; Shi, H.; Zhao, G. Hyperbolic deep neural networks: A survey. IEEE Trans. Pattern Anal. Mach. Intell. 2021, 44, 10023–10044.
- Papadopoulos, F.; Kitsak, M.; Serrano, M.Á.; Boguná, M.; Krioukov, D. Popularity versus similarity in growing networks. Nature 2012, 489, 537–540.
- Krioukov, D.; Papadopoulos, F.; Kitsak, M.; Vahdat, A.; Boguná, M. Hyperbolic geometry of complex networks. Phys. Rev. E 2010, 82, 036106.
- Nickel, M.; Kiela, D. Poincaré embeddings for learning hierarchical representations. In Proceedings of the Advances in Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; Volume 30.
- Nickel, M.; Kiela, D. Learning continuous hierarchies in the lorentz model of hyperbolic geometry. In Proceedings of the International Conference on Machine Learning, PMLR, Stockholm, Sweden, 10–15 July 2018; pp. 3779–3788.
- Sun, Z.; Chen, M.; Hu, W.; Wang, C.; Dai, J.; Zhang, W. Knowledge Association with Hyperbolic Knowledge Graph Embeddings. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), Online, 16–20 November 2020; pp. 5704–5716.
- Yang, M.; Zhou, M.; Pan, L.; King, I. κHGCN: Tree-likeness modeling via continuous and discrete curvature learning. In Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, Long Beach, CA, USA, 6–10 August 2023; pp. 2965–2977.
- Khrulkov, V.; Mirvakhabova, L.; Ustinova, E.; Oseledets, I.; Lempitsky, V. Hyperbolic image embeddings. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020; pp. 6418–6428.
- Liu, S.; Chen, J.; Pan, L.; Ngo, C.W.; Chua, T.S.; Jiang, Y.G. Hyperbolic visual embedding learning for zero-shot recognition. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020; pp. 9273–9281.
- Zhang, T.; Zheng, W.; Cui, Z.; Zong, Y.; Li, C.; Zhou, X.; Yang, J. Deep manifold-to-manifold transforming network for skeleton-based action recognition. IEEE Trans. Multimed. 2020, 22, 2926–2937.
- Liu, Q.; Nickel, M.; Kiela, D. Hyperbolic graph neural networks. In Proceedings of the Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8–14 December 2019; Volume 32.
- Chami, I.; Ying, Z.; Ré, C.; Leskovec, J. Hyperbolic graph convolutional neural networks. In Proceedings of the Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8–14 December 2019; Volume 32.
- Liu, W.; Wen, Y.; Yu, Z.; Li, M.; Raj, B.; Song, L. SphereFace: Deep hypersphere embedding for face recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 212–220.
- de Ocáriz Borde, H.S.; Kazi, A.; Barbero, F.; Lio, P. Latent graph inference using product manifolds. In Proceedings of the Eleventh International Conference on Learning Representations, Virtual Event, 25–29 April 2022.
- Sun, L.; Zhang, Z.; Ye, J.; Peng, H.; Zhang, J.; Su, S.; Philip, S.Y. A self-supervised mixed-curvature graph neural network. Proc. AAAI Conf. Artif. Intell. 2022, 36, 4146–4155.
- Dong, Z.; Jia, S.; Zhang, C.; Pei, M.; Wu, Y. Deep manifold learning of symmetric positive definite matrices with application to face recognition. In Proceedings of the AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; Volume 31.
- Gao, Z.; Wu, Y.; Bu, X.; Yu, T.; Yuan, J.; Jia, Y. Learning a robust representation via a deep network on symmetric positive definite manifolds. Pattern Recognit. 2019, 92, 1–12.
- Brooks, D.A.; Schwander, O.; Barbaresco, F.; Schneider, J.Y.; Cord, M. Exploring complex time-series representations for Riemannian machine learning of radar data. In Proceedings of the ICASSP 2019–2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brighton, UK, 12–17 May 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 3672–3676.
- Huang, Z.; Van Gool, L. A Riemannian network for SPD matrix learning. In Proceedings of the AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; Volume 31.
- Zhang, T.; Zheng, W.; Cui, Z.; Li, C. Deep manifold-to-manifold transforming network. In Proceedings of the 2018 25th IEEE International Conference on Image Processing (ICIP), Athens, Greece, 7–10 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 4098–4102.
- Chakraborty, R.; Bouza, J.; Manton, J.H.; Vemuri, B.C. ManifoldNet: A deep neural network for manifold-valued data with applications. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 44, 799–810.
- Chakraborty, R.; Yang, C.H.; Zhen, X.; Banerjee, M.; Archer, D.; Vaillancourt, D.; Singh, V.; Vemuri, B. A statistical recurrent model on the manifold of symmetric positive definite matrices. In Proceedings of the Advances in Neural Information Processing Systems, Montreal, QC, Canada, 3–8 December 2018; Volume 31.
- Brooks, D.; Schwander, O.; Barbaresco, F.; Schneider, J.Y.; Cord, M. Riemannian batch normalization for SPD neural networks. In Proceedings of the Advances in Neural Information Processing Systems, Vancouver, BC, Canada, 8–14 December 2019; Volume 32.
- Spivak, M. A Comprehensive Introduction to Differential Geometry; Publish or Perish, Inc.: Berkeley, CA, USA, 1979; Volume 2.
- Wu, F.; Souza, A.; Zhang, T.; Fifty, C.; Yu, T.; Weinberger, K. Simplifying graph convolutional networks. In Proceedings of the International Conference on Machine Learning, PMLR, Long Beach, CA, USA, 9–15 June 2019; pp. 6861–6871.
- Hamilton, W.; Ying, Z.; Leskovec, J. Inductive representation learning on large graphs. In Proceedings of the Advances in Neural Information Processing Systems, Long Beach, CA, USA, 4–9 December 2017; Volume 30.
- Chen, M.; Wei, Z.; Huang, Z.; Ding, B.; Li, Y. Simple and deep graph convolutional networks. In Proceedings of the International Conference on Machine Learning, PMLR, Virtual, 13–18 July 2020; pp. 1725–1735.
- Zhu, J.; Yan, Y.; Zhao, L.; Heimann, M.; Akoglu, L.; Koutra, D. Beyond homophily in graph neural networks: Current limitations and effective designs. Adv. Neural Inf. Process. Syst. 2020, 33, 7793–7804.
- Zhang, Y.; Wang, X.; Shi, C.; Jiang, X.; Ye, Y. Hyperbolic graph attention network. IEEE Trans. Big Data 2021, 8, 1690–1701.
- Zhang, Y.; Wang, X.; Shi, C.; Liu, N.; Song, G. Lorentzian graph convolutional networks. In Proceedings of the Web Conference 2021, Ljubljana, Slovenia, 19–23 April 2021; pp. 1249–1261.
- Chen, W.; Han, X.; Lin, Y.; Zhao, H.; Liu, Z.; Li, P.; Sun, M.; Zhou, J. Fully Hyperbolic Neural Networks. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics, Dublin, Ireland, 22–27 May 2022; pp. 5672–5686.
Dataset | #Nodes | #Features | #Classes | #Edges | δ-Hyperbolicity
---|---|---|---|---|---
Disease | 1044 | 1000 | 2 | 1043 | 0
Texas | 183 | 1703 | 5 | 295 | 1
Wisconsin | 251 | 1703 | 5 | 466 | 1
Cornell | 183 | 1703 | 5 | 280 | 1
Squirrel | 5201 | - | 5 | 198,493 | 1.5
Chameleon | 2277 | - | 5 | 31,421 | 1.5
PubMed | 19,717 | 500 | 3 | 88,651 | 3.5
Cora | 2708 | 1433 | 7 | 5429 | 11
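The δ-hyperbolicity column above quantifies tree-likeness via Gromov's four-point condition: for every quadruple of nodes, the two largest of the three pairwise distance sums differ by at most 2δ, and trees attain δ = 0. A minimal brute-force sketch (illustrative, not the tooling used in the paper):

```python
from collections import deque
from itertools import combinations

def bfs_distances(adj):
    """All-pairs shortest-path distances on an unweighted graph
    given as a dict mapping node -> list of neighbors."""
    dist = {}
    for src in adj:
        d = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in d:
                    d[w] = d[u] + 1
                    q.append(w)
        dist[src] = d
    return dist

def delta_hyperbolicity(adj):
    """Gromov four-point delta: the maximum over all node quadruples of
    (largest pairwise-distance sum - second largest) / 2. Trees give 0."""
    dist = bfs_distances(adj)
    delta = 0.0
    for x, y, u, v in combinations(adj, 4):
        sums = sorted([dist[x][y] + dist[u][v],
                       dist[x][u] + dist[y][v],
                       dist[x][v] + dist[y][u]])
        delta = max(delta, (sums[-1] - sums[-2]) / 2)
    return delta
```

For example, any tree yields δ = 0, while a 4-cycle already yields δ = 1; lower values in the table therefore indicate more tree-like datasets. (The brute-force loop is O(n⁴) and only practical for small graphs; the table values would come from optimized implementations.)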
Model Type | Model | Tree | Heterophily | Neighbor
---|---|---|---|---
Shallow model | MLP | × | × | ×
Euclidean GNNs | GCN [12] | × | × | ✓
Euclidean GNNs | SGC [47] | × | × | ✓
Euclidean GNNs | GAT [21] | × | × | ✓
Euclidean GNNs | SAGE [48] | × | × | ✓
Euclidean GNNs | GCNII [49] | × | × | ✓
Euclidean GNNs | GeomGCN [18] | × | ✓ | ✓
Euclidean GNNs | H2GCN [50] | × | ✓ | ✓
Hyperbolic GNNs | HGCN [34] | ✓ | × | ✓
Hyperbolic GNNs | HAT [51] | ✓ | × | ✓
Hyperbolic GNNs | LGCN [52] | ✓ | × | ✓
Hyperbolic GNNs | HyboNet [53] | ✓ | × | ✓
SPD model | RGCN | ✓ | ✓ | ✓
Space | Model | Disease | Airport | PubMed | Cora
---|---|---|---|---|---
Euclidean | GCN [12] | 69.7 ± 0.4 | 81.4 ± 0.6 | 78.1 ± 0.2 | 81.3 ± 0.3
Euclidean | GAT [21] | 70.4 ± 0.4 | 81.5 ± 0.3 | 79.0 ± 0.3 | 83.0 ± 0.7
Euclidean | SAGE [48] | 69.1 ± 0.6 | 82.1 ± 0.5 | 77.4 ± 2.2 | 77.9 ± 2.4
Euclidean | SGC [47] | 69.5 ± 0.2 | 80.6 ± 0.1 | 78.9 ± 0.0 | 81.0 ± 0.1
Hyperbolic | HGCN [34] | 82.8 ± 0.8 | 90.6 ± 0.2 | 78.4 ± 0.4 | 81.3 ± 0.6
Hyperbolic | HAT [51] | 83.6 ± 0.9 | – | 78.6 ± 0.5 | 83.1 ± 0.6
Hyperbolic | LGCN [52] | 84.4 ± 0.8 | 90.9 ± 1.7 | 78.6 ± 0.7 | 83.3 ± 0.7
Hyperbolic | HyboNet [53] | 96.0 ± 1.0 | 90.9 ± 1.4 | 78.0 ± 1.0 | 80.2 ± 1.3
SPD | RGCN | 96.9 ± 0.9 | 92.6 ± 1.8 | 79.4 ± 1.2 | 80.5 ± 1.5
Model | Texas | Wisconsin | Cornell | Squirrel | Chameleon
---|---|---|---|---|---
MLP | 80.8 ± 4.7 | 85.2 ± 3.3 | 81.9 ± 6.4 | 63.6 ± 2.1 | 72.8 ± 1.5
GCN [12] | 55.1 ± 5.1 | 51.7 ± 3.0 | 60.5 ± 5.3 | 38.2 ± 1.6 | 60.4 ± 2.1
SAGE [48] | 82.4 ± 6.1 | 81.1 ± 5.5 | 75.9 ± 5.0 | 41.6 ± 0.7 | 58.7 ± 1.6
GeomGCN [18] | 66.7 ± 2.7 | 64.5 ± 3.6 | 60.5 ± 3.6 | 38.1 ± 0.9 | 60.0 ± 2.8
GCNII [49] | 77.5 ± 3.8 | 80.3 ± 3.4 | 77.8 ± 3.7 | 56.6 ± 2.1 | 67.3 ± 2.4
H2GCN [50] | 84.8 ± 7.2 | 86.6 ± 4.6 | 82.7 ± 5.2 | 51.0 ± 4.2 | 69.5 ± 1.8
HGCN [34] | 70.1 ± 3.3 | 83.2 ± 4.5 | 79.4 ± 4.4 | 62.3 ± 1.5 | 74.9 ± 1.5
HyboNet [53] | 72.2 ± 4.9 | 86.5 ± 4.5 | 77.2 ± 4.7 | 69.1 ± 1.6 | 78.7 ± 0.9
RGCN | 89.9 ± 6.6 | 88.7 ± 3.8 | 85.9 ± 5.1 | 75.3 ± 1.4 | 80.3 ± 0.6
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wu, Y.; Hu, L.; Hu, J. Modeling Tree-like Heterophily on Symmetric Matrix Manifolds. Entropy 2024, 26, 377. https://doi.org/10.3390/e26050377