

Edge but not Least: Cross-View Graph Pooling

Conference paper in Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13714)

Abstract

Graph neural networks have emerged as a powerful representation learning model for undertaking various graph prediction tasks. Various graph pooling methods have been developed to coarsen an input graph into a succinct graph-level representation through aggregating node embeddings obtained via graph convolution. However, because most graph pooling methods are heavily node-centric, they fail to fully leverage the crucial information contained in graph structure. This paper presents a cross-view graph pooling method (Co-Pooling) that explicitly exploits crucial graph substructures for learning graph representations. Co-Pooling is designed to fuse the pooled representations from both node view and edge view. Through cross-view interaction, edge-view pooling and node-view pooling mutually reinforce each other to learn informative graph representations. Extensive experiments on one synthetic and 15 real-world graph datasets validate the effectiveness of our Co-Pooling method. Our results and analysis show that (1) our method is able to yield promising results over graphs with various types of node attributes, and (2) our method can achieve superior performance over state-of-the-art pooling methods on graph classification and regression tasks.
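To make the cross-view idea concrete, the following is a minimal, purely illustrative NumPy sketch of fusing an edge-view aggregation with a node-view top-k selection. The scoring functions (`node_view_pool`, `edge_view_pool`, `co_pool`) and their toy heuristics are hypothetical stand-ins for exposition, not the authors' Co-Pooling formulation.

```python
import numpy as np

def node_view_pool(X, A, ratio=0.5):
    """Node-view pooling sketch: score nodes with a toy scorer
    (feature sum) and keep the top-scoring fraction of nodes."""
    scores = X.sum(axis=1)                       # toy node scores
    k = max(1, int(ratio * X.shape[0]))
    keep = np.argsort(scores)[-k:]               # indices of top-k nodes
    return X[keep], A[np.ix_(keep, keep)]        # coarsened features + adjacency

def edge_view_pool(X, A):
    """Edge-view pooling sketch: weight each node's neighbourhood by
    normalized edge weights before aggregating features."""
    deg = A.sum(axis=1, keepdims=True)
    W = A / np.maximum(deg, 1)                   # toy edge weights (row-normalized)
    return W @ X                                 # edge-weighted feature aggregation

def co_pool(X, A, ratio=0.5):
    """Fuse the two views: aggregate features along edges first,
    then select nodes based on the aggregated features."""
    Xe = edge_view_pool(X, A)
    return node_view_pool(Xe, A, ratio)

# Toy 4-node undirected graph with 2-dimensional node features.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.arange(8, dtype=float).reshape(4, 2)

Xp, Ap = co_pool(X, A, ratio=0.5)
print(Xp.shape, Ap.shape)                        # (2, 2) (2, 2)
```

In this sketch the edge view informs which nodes the node view keeps, which is the general spirit of cross-view interaction; the paper's actual method learns both views end-to-end rather than using fixed heuristics.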



Acknowledgements

Xiaowei Zhou is supported by a Data61 PhD Scholarship from CSIRO. Ivor W. Tsang is supported by the Centre for Frontier AI Research, A*STAR, and by the ARC under grant DP200101328. This work is partially supported by the USYD-Data61 Collaborative Research Project grant.

Author information

Corresponding author: Jie Yin.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zhou, X., Yin, J., Tsang, I.W. (2023). Edge but not Least: Cross-View Graph Pooling. In: Amini, M.-R., Canu, S., Fischer, A., Guns, T., Kralj Novak, P., Tsoumakas, G. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2022. Lecture Notes in Computer Science, vol. 13714. Springer, Cham. https://doi.org/10.1007/978-3-031-26390-3_21


  • DOI: https://doi.org/10.1007/978-3-031-26390-3_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-26389-7

  • Online ISBN: 978-3-031-26390-3

  • eBook Packages: Computer Science, Computer Science (R0)
