DOI: 10.1145/3449639.3459318

Fitness landscape analysis of graph neural network architecture search spaces

Published: 26 June 2021

Abstract

Neural Architecture Search (NAS) refers to a set of methods designed to automatically configure the layout of neural networks. Their success with Convolutional Neural Networks has inspired their use in optimizing other types of neural network architectures, including Graph Neural Networks (GNNs). GNNs have been applied extensively to collections of real-world data, achieving state-of-the-art results in tasks such as circuit design, molecular structure generation, and anomaly detection. Many GNN models have been proposed recently, and choosing the best model for each problem has become a cumbersome and error-prone task. To alleviate this problem, recent works have proposed strategies for applying NAS to GNN models. However, different search methods converge relatively fast to good architectures, which raises questions about the structure of the problem. In this work we use Fitness Landscape Analysis (FLA) measures to characterize the search space explored by NAS methods for GNNs. We sample almost 90k different architectures that cover most of the fitness range, and represent them using both a one-hot encoding and an embedding representation. Results of the fitness distance correlation and dispersion metrics show that the fitness landscape is easy to explore and presents low neutrality.
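
To make the two measures named above concrete, the sketch below computes fitness distance correlation and a dispersion-style statistic over a sample of binary (one-hot) encoded architectures under Hamming distance. This is a minimal illustration rather than the authors' implementation: the function names, the 5% elite fraction, the maximization assumption, and the random stand-in data are all assumptions made for the example.

    # Minimal sketch (not the authors' code) of two FLA measures mentioned in the
    # abstract: fitness distance correlation (FDC) and a dispersion-style metric,
    # both using Hamming distance over a one-hot architecture encoding.
    import numpy as np

    def fitness_distance_correlation(encodings, fitnesses):
        """Pearson correlation between fitness and distance to the best sampled point."""
        X = np.asarray(encodings)
        f = np.asarray(fitnesses, dtype=float)
        best = X[np.argmax(f)]                       # assumes higher fitness is better
        dists = np.count_nonzero(X != best, axis=1)  # Hamming distance to best sample
        return np.corrcoef(f, dists)[0, 1]

    def dispersion_metric(encodings, fitnesses, top_fraction=0.05):
        """Average pairwise distance among the best `top_fraction` of the sample,
        minus the average pairwise distance of the whole sample (one common variant)."""
        X = np.asarray(encodings)
        order = np.argsort(fitnesses)[::-1]          # best first, assuming maximization
        k = max(2, int(top_fraction * len(order)))

        def avg_pairwise(A):
            d = [np.count_nonzero(A[i] != A[j])
                 for i in range(len(A)) for j in range(i + 1, len(A))]
            return float(np.mean(d))

        return avg_pairwise(X[order[:k]]) - avg_pairwise(X)

    # Usage with random stand-in data (a real run would plug in the sampled GNN
    # architectures and their validation-accuracy fitness values):
    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(500, 32))           # 500 architectures, 32 binary features
    f = rng.random(500)                              # stand-in fitness values
    print(fitness_distance_correlation(X, f), dispersion_metric(X, f))

Under maximization, a strongly negative FDC indicates that fitness improves as solutions approach the best one, a sign of an easy-to-search landscape, while a negative dispersion difference suggests the best solutions are clustered in a single region of the search space.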





Published In

GECCO '21: Proceedings of the Genetic and Evolutionary Computation Conference
June 2021
1219 pages
ISBN:9781450383509
DOI:10.1145/3449639
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 26 June 2021


Author Tags

  1. fitness landscape analysis
  2. graph neural networks
  3. neural architecture search

Qualifiers

  • Research-article

Conference

GECCO '21

Acceptance Rates

Overall acceptance rate: 1,669 of 4,410 submissions (38%)

Article Metrics

  • Downloads (last 12 months): 41
  • Downloads (last 6 weeks): 3

Reflects downloads up to 10 Nov 2024

Cited By

  • (2024) Universal Local Attractors on Graphs. Applied Sciences 14(11), 4533. DOI: 10.3390/app14114533. Online publication date: 25-May-2024
  • (2024) Minimum-Fuel Low-Thrust Trajectory Optimization via a Direct Adaptive Evolutionary Approach. IEEE Transactions on Aerospace and Electronic Systems 60(2), 1319-1333. DOI: 10.1109/TAES.2023.3335906. Online publication date: Apr-2024
  • (2023) Channel Configuration for Neural Architecture: Insights from the Search Space. Proceedings of the Genetic and Evolutionary Computation Conference, 1267-1275. DOI: 10.1145/3583131.3590386. Online publication date: 15-Jul-2023
  • (2023) Local Fitness Landscape Exploration Based Genetic Algorithms. IEEE Access 11, 3324-3337. DOI: 10.1109/ACCESS.2023.3234775. Online publication date: 2023
  • (2023) A data-driven approach to neural architecture search initialization. Annals of Mathematics and Artificial Intelligence. DOI: 10.1007/s10472-022-09823-0. Online publication date: 22-Mar-2023
  • (2023) On the Effect of Solution Representation and Neighborhood Definition in AutoML Fitness Landscapes. Evolutionary Computation in Combinatorial Optimization, 227-243. DOI: 10.1007/978-3-031-30035-6_15. Online publication date: 12-Apr-2023
  • (2022) AutoML Loss Landscapes. ACM Transactions on Evolutionary Learning and Optimization 2(3), 1-30. DOI: 10.1145/3558774. Online publication date: 22-Nov-2022
  • (2022) A Fitness Landscape Analysis of the LeNet-5 Loss Function. 2022 IEEE Symposium Series on Computational Intelligence (SSCI), 352-359. DOI: 10.1109/SSCI51031.2022.10022277. Online publication date: 4-Dec-2022
  • (2022) Analysis of Neutrality of AutoML Search Spaces with Local Optima Networks. Intelligent Systems, 473-487. DOI: 10.1007/978-3-031-21686-2_33. Online publication date: 28-Nov-2022
