
A Training-free Genetic Neural Architecture Search

Published: 07 January 2022
DOI: 10.1145/3491396.3506510

Abstract

Neural architecture search (NAS) provides an alternative way to construct a "good neural architecture," one that typically outperforms hand-crafted architectures, for solving complex problems without domain knowledge. A critical issue with most NAS techniques, however, is that they are computationally very expensive, because evaluating the quality of each candidate architecture during the search involves one or more complete or partial training processes. To mitigate the cost of evaluating each neural architecture found by the search algorithm, we present an efficient NAS method in this study, called genetic algorithm and noise immunity for neural architecture search without training (GA-NINASWOT). The genetic algorithm (GA) in the proposed method searches for high-potential neural architectures, while a modified scoring method based on neural architecture search without training (NASWOT) replaces the training process in measuring the quality of each architecture the GA finds. To evaluate the performance of GA-NINASWOT, we compared it with several state-of-the-art NAS techniques, including weight-sharing methods, non-weight-sharing methods, and NASWOT. Simulation results show that GA-NINASWOT outperforms all the compared weight-sharing methods and NASWOT in terms of accuracy and computation time. Moreover, GA-NINASWOT achieves results comparable to those of the non-weight-sharing methods while reducing the search time by 99%.
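
As a rough illustration of the pipeline the abstract describes, the sketch below pairs a simple genetic algorithm with a NASWOT-style training-free score used as the fitness function. The kernel construction follows Mellor et al. [4]; the noise-immunity modification used in GA-NINASWOT is not reproduced here, and the search-space interface (random_genome, decode, random_gene), the GA operators, and all rates are hypothetical choices for illustration, not the authors' implementation.

# Minimal sketch (Python/PyTorch), assuming the hypothetical search-space
# interface described above; not the authors' GA-NINASWOT implementation.
import copy
import random

import torch
import torch.nn as nn


def naswot_score(net: nn.Module, x: torch.Tensor) -> float:
    """Training-free NASWOT score (Mellor et al. [4]): log-determinant of
    the kernel built from binary ReLU activation codes of an untrained
    network. Assumes the network uses nn.ReLU modules (not F.relu)."""
    codes = []

    def hook(_module, _inputs, output):
        # Binary code: which units fire for each input in the mini-batch.
        codes.append((output.detach() > 0).flatten(1).float())

    handles = [m.register_forward_hook(hook)
               for m in net.modules() if isinstance(m, nn.ReLU)]
    with torch.no_grad():
        net(x)
    for h in handles:
        h.remove()

    c = torch.cat(codes, dim=1)  # one long binary code per input
    # K[i, j] = number of activation entries where inputs i and j agree.
    k = c @ c.t() + (1 - c) @ (1 - c).t()
    sign, logdet = torch.linalg.slogdet(k)
    return logdet.item() if sign.item() > 0 else float("-inf")


def ga_search(search_space, x, generations=20, pop_size=20,
              crossover_rate=0.9, mutation_rate=0.1):
    """Toy GA with binary tournament selection, one-point crossover,
    point mutation, and elitist survivor selection; the fitness of a
    genome is the training-free score of the network it decodes to."""
    pop = [search_space.random_genome() for _ in range(pop_size)]
    fit = [naswot_score(search_space.decode(g), x) for g in pop]
    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            # Binary tournament: pick the fitter of two random parents.
            a, b = (max(random.sample(range(pop_size), 2),
                        key=lambda i: fit[i]) for _ in range(2))
            c1, c2 = copy.deepcopy(pop[a]), copy.deepcopy(pop[b])
            if random.random() < crossover_rate:  # one-point crossover
                cut = random.randrange(1, len(c1))
                c1[cut:], c2[cut:] = c2[cut:], c1[cut:]
            for child in (c1, c2):  # point mutation
                for i in range(len(child)):
                    if random.random() < mutation_rate:
                        child[i] = search_space.random_gene(i)
                children.append(child)
        child_fit = [naswot_score(search_space.decode(g), x)
                     for g in children]
        # Keep the best pop_size individuals of parents + children.
        merged = sorted(zip(fit + child_fit, pop + children),
                        key=lambda t: t[0], reverse=True)[:pop_size]
        fit, pop = [f for f, _ in merged], [g for _, g in merged]
    return pop[0], fit[0]  # best genome found and its score

In the actual method, the GA evaluates every candidate with the modified, noise-immune variant of this score rather than the plain NASWOT kernel; either way, no candidate is ever trained, which is what removes the dominant cost of conventional NAS.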

References

[1]
X. He, K. Zhao, and X. Chu, "AutoML: A survey of the state-of-the-art," Knowledge-Based Systems, vol. 212, pp. 1--27, 2020.
[2]
G. Kyriakides and K. Margaritis, "An introduction to neural architecture search for convolutional networks," arXiv:2005.11074, 2020.
[3]
M. Wistuba, A. Rawat, and T. Pedapati, "A survey on neural architecture search," arXiv:1905.01392, 2019.
[4]
J. Mellor, J. Turner, A. Storkey, and E. J. Crowley, "Neural architecture search without training," arXiv:2006.04647, 2020.
[5]
W. Liu, Z. Wang, X. Liu, N. Zeng, Y. Liu, and F. Alsaadi, "A survey of deep neural network architectures and their applications," Neurocomputing, vol. 234, pp. 11--26, 2017.
[6]
T. Elsken, J. H. Metzen, and F. Hutter, "Neural architecture search: A survey," arXiv:1808.05377, 2019.
[7]
B. Zoph and Q. V. Le, "Neural architecture search with reinforcement learning," arXiv:1611.01578, 2017.
[8]
Y. Liu, Y. Sun, B. Xue, M. Zhang, G. G. Yen, and K. C. Tan, "A survey on evolutionary neural architecture search," IEEE Transactions on Neural Networks and Learning Systems, pp. 1--21, 2021.
[9]
E. Real, A. Aggarwal, Y. Huang, and Q. V. Le, "Regularized evolution for image classifier architecture search," in Proceedings of the AAAI Conference on Artificial Intelligence, 2019, pp. 4780--4789.
[10]
L. Xie and A. Yuille, "Genetic CNN," in Proceedings of the IEEE International Conference on Computer Vision, 2017, pp. 1388--1397.
[11]
H. Liu, K. Simonyan, and Y. Yang, "DARTS: Differentiable architecture search," arXiv:1806.09055, 2018.
[12]
C. Liu, B. Zoph, M. Neumann, J. Shlens, W. Hua, L.-J. Li, L. Fei-Fei, A. Yuille, J. Huang, and K. Murphy, "Progressive neural architecture search," in Proceedings of the European Conference on Computer Vision, 2018, pp. 19--35.
[13]
D. Zhou, X. Zhou, W. Zhang, C. C. Loy, S. Yi, X. Zhang, and W. Ouyang, "EcoNAS: Finding proxies for economical neural architecture search," in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020, pp. 11393--11401.
[14]
M. S. Abdelfattah, A. Mehrotra, Ł. Dudziak, and N. D. Lane, "Zero-cost proxies for lightweight NAS," in Proceedings of the International Conference on Learning Representations, 2021, pp. 1--17.
[15]
X. Dong and Y. Yang, "NAS-Bench-201: Extending the scope of reproducible neural architecture search," in Proceedings of the International Conference on Learning Representations, 2020, pp. 1--16.
[16]
R. J. Williams, "Simple statistical gradient-following algorithms for connectionist reinforcement learning," Machine Learning, vol. 8, pp. 229--256, 1992.
[17]
S. Falkner, A. Klein, and F. Hutter, "BOHB: Robust and efficient hyperparameter optimization at scale," in Proceedings of the International Conference on Machine Learning, 2018, pp. 1437--1446.
[18]
L. Li and A. Talwalkar, "Random search and reproducibility for neural architecture search," in Proceedings of the Uncertainty in Artificial Intelligence, 2020, pp. 367--377.
[19]
X. Dong and Y. Yang, "Searching for a robust neural architecture in four GPU hours," in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2019, pp. 1761--1770.
[20]
X. Dong and Y. Yang, "One-shot neural architecture search via self-evaluated template network," in Proceedings of the IEEE/CVF International Conference on Computer Vision, 2019, pp. 3680--3689.
[21]
H. Pham, M. Guan, B. Zoph, Q. Le, and J. Dean, "Efficient neural architecture search via parameters sharing," in Proceedings of the International Conference on Machine Learning, 2018, pp. 4095--4104.


Published In

ACM ICEA '21: Proceedings of the 2021 ACM International Conference on Intelligent Computing and its Emerging Applications
December 2021
241 pages
ISBN: 9781450391603
DOI: 10.1145/3491396
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Neural architecture search
  2. genetic algorithm
  3. training-free

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • MOST

Conference

ACM ICEA '21


Article Metrics

  • Downloads (Last 12 months): 96
  • Downloads (Last 6 weeks): 4
Reflects downloads up to 08 Feb 2025


Cited By

  • (2025) Systematic review on neural architecture search. Artificial Intelligence Review, 58:3. DOI: 10.1007/s10462-024-11058-w. Online publication date: 6-Jan-2025.
  • (2024) THNAS-GA: A Genetic Algorithm for Training-free Hardware-aware Neural Architecture Search. Proceedings of the Genetic and Evolutionary Computation Conference, pp. 1128-1136. DOI: 10.1145/3638529.3654226. Online publication date: 14-Jul-2024.
  • (2024) Zero-Shot Neural Architecture Search: Challenges, Solutions, and Opportunities. IEEE Transactions on Pattern Analysis and Machine Intelligence, 46:12, pp. 7618-7635. DOI: 10.1109/TPAMI.2024.3395423. Online publication date: Dec-2024.
  • (2024) A Training-Free Neural Architecture Search Algorithm Based on Search Economics. IEEE Transactions on Evolutionary Computation, 28:2, pp. 445-459. DOI: 10.1109/TEVC.2023.3264533. Online publication date: Apr-2024.
  • (2024) A Lightweight Training-Free Method for Neural Architecture Search. 2024 IEEE Congress on Evolutionary Computation (CEC), pp. 1-8. DOI: 10.1109/CEC60901.2024.10611899. Online publication date: 30-Jun-2024.
  • (2024) Neural architecture search for image super-resolution: A review on the emerging state-of-the-art. Neurocomputing, article 128481. DOI: 10.1016/j.neucom.2024.128481. Online publication date: Aug-2024.
  • (2024) Training-free neural architecture search: A review. ICT Express, 10:1, pp. 213-231. DOI: 10.1016/j.icte.2023.11.001. Online publication date: Feb-2024.
  • (2023) FreeREA: Training-Free Evolution-based Architecture Search. 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), pp. 1493-1502. DOI: 10.1109/WACV56688.2023.00154. Online publication date: Jan-2023.
  • (2023) A Hybrid Filter Pruning Method Based on Linear Region Analysis. 2023 IEEE 22nd International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), pp. 2424-2431. DOI: 10.1109/TrustCom60117.2023.00341. Online publication date: 1-Nov-2023.
  • (2022) Evolution of activation functions for deep learning-based image classification. Proceedings of the Genetic and Evolutionary Computation Conference Companion, pp. 2113-2121. DOI: 10.1145/3520304.3533949. Online publication date: 9-Jul-2022.
