DOI: 10.1145/3447548.3467312

DeGNN: Improving Graph Neural Networks with Graph Decomposition

Published: 14 August 2021

Abstract

Mining graph-structured data is an integral component of graph data management. A recent trending technique, the graph convolutional network (GCN), has gained momentum in the graph mining field and plays an essential part in numerous graph-related tasks. Although emerging GCN optimization techniques bring improvements in specific scenarios, they perform inconsistently across applications and impose substantial trial-and-error costs on practitioners. Moreover, existing GCN models often suffer from the oversmoothing problem, and the entanglement of various graph patterns can undermine robustness and harm the final performance of GCNs. In this work, we propose a simple yet efficient graph decomposition approach to improve the performance of general graph neural networks. We first empirically study existing graph decomposition methods and then propose an automatic connectivity-aware graph decomposition algorithm, DeGNN. To provide a theoretical explanation, we characterize GCNs from an information-theoretic perspective and show that, under certain conditions, the mutual information between the input of a GCN and its output after l layers converges to 0 exponentially with respect to l. We further show that graph decomposition can weaken the conditions for this convergence rate, alleviating the information loss as the GCN becomes deeper. Extensive experiments on academic benchmarks and real-world production datasets demonstrate that graph decomposition generally boosts the performance of GNN models; moreover, our proposed solution, DeGNN, achieves state-of-the-art performance on almost all of these tasks.
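The abstract describes the method only at a high level. As a rough illustration of the general idea of splitting a graph into edge-disjoint subgraphs and propagating node features over each part separately, the NumPy sketch below uses an assumed degree-based splitting rule and a parameter-free two-layer GCN-style propagation; the splitting criterion, layer count, and function names are illustrative assumptions, not the connectivity-aware DeGNN algorithm described in the full paper.

```python
# Hypothetical sketch of graph decomposition followed by per-subgraph
# GCN-style propagation. The degree-based edge split and the 2-layer
# parameter-free propagation are illustrative assumptions, not the
# paper's DeGNN algorithm.
import numpy as np


def normalize(adj):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2} as in vanilla GCN."""
    adj = adj + np.eye(adj.shape[0])
    deg = adj.sum(axis=1)
    d_inv_sqrt = deg ** -0.5
    return d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]


def decompose_by_degree(adj, threshold):
    """Split edges into two edge-disjoint subgraphs: edges whose endpoints
    both have degree <= threshold, and all remaining edges. This stands in
    for a connectivity-based decomposition criterion."""
    deg = adj.sum(axis=1)
    low = (deg[:, None] <= threshold) & (deg[None, :] <= threshold)
    return adj * low, adj * ~low


def gcn_propagate(adj, features, num_layers=2):
    """Parameter-free feature smoothing over the normalized adjacency;
    a learned linear layer would follow in a full model."""
    a_hat = normalize(adj)
    h = features
    for _ in range(num_layers):
        h = a_hat @ h
    return h


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 8, 4
    adj = (rng.random((n, n)) < 0.3).astype(float)
    adj = np.triu(adj, 1)
    adj = adj + adj.T                      # undirected, no self-loops
    features = rng.standard_normal((n, d))

    # Propagate over each decomposed subgraph and concatenate, so features
    # from distinct connectivity patterns are mixed less aggressively than
    # on the original graph.
    parts = decompose_by_degree(adj, threshold=np.median(adj.sum(axis=1)))
    embedding = np.concatenate([gcn_propagate(p, features) for p in parts], axis=1)
    print(embedding.shape)  # (8, 8): d features from each of the two subgraphs
```

Concatenating the per-subgraph outputs, rather than propagating everything over the original graph, is one plausible reading of how decomposition could slow the information loss the abstract attributes to deeper GCNs; the paper's actual construction should be taken from the full text.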

Supplementary Material

MP4 File (KDD21-rst2081.mp4)
Presentation video




Published In

KDD '21: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining
August 2021
4259 pages
ISBN:9781450383325
DOI:10.1145/3447548
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 August 2021


Author Tags

  1. graph decomposition
  2. graph neural network
  3. information loss

Qualifiers

  • Research-article

Conference

KDD '21

Acceptance Rates

Overall Acceptance Rate 1,133 of 8,635 submissions, 13%

Bibliometrics & Citations

Article Metrics

  • Downloads (last 12 months): 139
  • Downloads (last 6 weeks): 12
Reflects downloads up to 02 Oct 2024

Cited By
  • (2024) Time series forecasting model for non-stationary series pattern extraction using deep learning and GARCH modeling. Journal of Cloud Computing 13:1. DOI: 10.1186/s13677-023-00576-7. Online publication date: 2-Jan-2024
  • (2024) A Survey on Graph Representation Learning Methods. ACM Transactions on Intelligent Systems and Technology 15:1 (1-55). DOI: 10.1145/3633518. Online publication date: 16-Jan-2024
  • (2024) LearnSC: An Efficient and Unified Learning-Based Framework for Subgraph Counting Problem. 2024 IEEE 40th International Conference on Data Engineering (ICDE), 2625-2638. DOI: 10.1109/ICDE60146.2024.00206. Online publication date: 13-May-2024
  • (2024) Harnessing the Power of Graph Representation in Climate Forecasting: Predicting Global Monthly Mean Sea Surface Temperatures and Anomalies. Earth and Space Science 11:3. DOI: 10.1029/2023EA003455. Online publication date: 21-Mar-2024
  • (2024) A multisensory Interaction Framework for Human-Cyber–Physical System based on Graph Convolutional Networks. Advanced Engineering Informatics 61 (102482). DOI: 10.1016/j.aei.2024.102482. Online publication date: Aug-2024
  • (2024) Tackling Oversmoothing in GNN via Graph Sparsification. Machine Learning and Knowledge Discovery in Databases. Research Track and Demo Track, 161-179. DOI: 10.1007/978-3-031-70371-3_10. Online publication date: 22-Aug-2024
  • (2023) Scapin: Scalable Graph Structure Perturbation by Augmented Influence Maximization. Proceedings of the ACM on Management of Data 1:2 (1-21). DOI: 10.1145/3589291. Online publication date: 20-Jun-2023
  • (2023) k-Hopped Link Prediction With Graph Embedding. 2023 Congress in Computer Science, Computer Engineering, & Applied Computing (CSCE), 600-607. DOI: 10.1109/CSCE60160.2023.00104. Online publication date: 24-Jul-2023
  • (2022) Multi-Feature Behavior Relationship for Multi-Behavior Recommendation. Applied Sciences 12:24 (12909). DOI: 10.3390/app122412909. Online publication date: 15-Dec-2022
  • (2022) TSPLIT: Fine-grained GPU Memory Management for Efficient DNN Training via Tensor Splitting. 2022 IEEE 38th International Conference on Data Engineering (ICDE), 2615-2628. DOI: 10.1109/ICDE53745.2022.00241. Online publication date: May-2022
