DOI: 10.1145/3414274.3414490
research-article

Graph Pooling in Graph Neural Networks with Node Feature Correlation

Published: 26 August 2020

Abstract

Because of the excellent performance of convolutional neural networks in computer vision and natural language processing, the convolution operation has been extended to graph data, where it is defined as graph convolution. Unlike node classification, graph classification must attend to the global information of a graph, which calls for a graph pooling mechanism to extract it. Recently, many researchers have studied graph pooling and proposed a variety of pooling models. However, these methods are general-purpose, and graph classification accuracy still has room for improvement. We therefore propose covariance pooling (CovPooling) to improve classification accuracy on graph datasets. CovPooling uses node feature correlation to learn a hierarchical representation of a graph, exploiting both node information and graph topology. Experiments show that our pooling module can be integrated with multiple graph convolution layers and achieves state-of-the-art performance on some datasets.
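
The abstract does not spell out the CovPooling operator itself. As a rough, speculative illustration of pooling driven by node feature correlation, the sketch below scores nodes by their projection onto the principal direction of the node-feature covariance matrix and keeps the top-ranked nodes, a TopK/SAGPool-style selection. The class name CovTopKPooling, the pooling ratio, and the scoring rule are assumptions made for this example, not the authors' published method.

```python
# Hypothetical sketch of a covariance-based top-k graph pooling layer in PyTorch.
# The real CovPooling operator is not specified in the abstract; scoring nodes by
# their projection onto the principal direction of the feature covariance is only
# an illustration of "pooling with node feature correlation".
import torch
import torch.nn as nn


class CovTopKPooling(nn.Module):
    def __init__(self, ratio: float = 0.5):
        super().__init__()
        self.ratio = ratio  # fraction of nodes kept after pooling

    def forward(self, x: torch.Tensor, adj: torch.Tensor):
        # x:   (N, F) node feature matrix
        # adj: (N, N) dense adjacency matrix
        n = x.size(0)
        k = max(1, int(self.ratio * n))

        # Covariance of node features across nodes: (F, F)
        centered = x - x.mean(dim=0, keepdim=True)
        cov = centered.t() @ centered / max(n - 1, 1)

        # Principal direction of the covariance summarizes feature correlation.
        # eigh returns eigenvalues in ascending order, so the last vector is principal.
        _, eigvecs = torch.linalg.eigh(cov)
        principal = eigvecs[:, -1]                    # (F,)

        # Score each node by its (absolute) projection onto the principal direction.
        scores = (centered @ principal).abs()         # (N,)

        # Keep the top-k scoring nodes; gate their features by the score so the
        # selection stays differentiable, as in TopK/SAGPool-style pooling.
        top_scores, idx = torch.topk(scores, k)
        x_pooled = x[idx] * torch.tanh(top_scores).unsqueeze(-1)
        adj_pooled = adj[idx][:, idx]                 # coarsened topology
        return x_pooled, adj_pooled, idx


if __name__ == "__main__":
    x = torch.randn(6, 8)                   # 6 nodes, 8 features
    adj = (torch.rand(6, 6) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float()     # symmetrize the random adjacency
    pool = CovTopKPooling(ratio=0.5)
    x_p, adj_p, idx = pool(x, adj)
    print(x_p.shape, adj_p.shape, idx)
```

In a hierarchical model, such a layer would sit between graph convolution layers, with each application coarsening the graph before a final readout for classification.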


Cited By

  • (2024) Graph pooling in graph neural networks: methods and their applications in omics studies. Artificial Intelligence Review, 57(11). DOI: 10.1007/s10462-024-10918-9. Online publication date: 16-Sep-2024
  • (2022) Fea2Fea: Exploring Structural Feature Correlations via Graph Neural Networks. Machine Learning and Principles and Practice of Knowledge Discovery in Databases, pp. 238-257. DOI: 10.1007/978-3-030-93736-2_19. Online publication date: 17-Feb-2022


    Published In

    DSIT 2020: Proceedings of the 3rd International Conference on Data Science and Information Technology
    July 2020
    261 pages
    ISBN:9781450376044
    DOI:10.1145/3414274
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    In-Cooperation

    • National University of Singapore
    • Sungkyunkwan University (SKKU)

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 26 August 2020

    Author Tags

    1. Convolutional neural network
    2. covariance pooling (CovPooling)
    3. graph classification
    4. hierarchical graph representations

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • the Scientific and Technological Projects of Qingyuan
    • the Nature Science Foundation of China
    • Guangdong Provincial Key Laboratory Project of Intellectual Property and Big Data

    Conference

    DSIT 2020

    Acceptance Rates

    DSIT 2020 Paper Acceptance Rate 40 of 97 submissions, 41%;
    Overall Acceptance Rate 114 of 277 submissions, 41%
