Survey on Evolutionary Deep Learning: Principles, Algorithms, Applications, and Open Issues

Published: 15 September 2023

Abstract

Over recent years, deep learning (DL) has developed rapidly in both industry and academia. However, finding the optimal hyperparameters of a DL model often requires high computational cost and considerable human expertise. To mitigate this issue, evolutionary computation (EC), a powerful heuristic search approach, has shown significant merit in the automated design of DL models, an approach known as evolutionary deep learning (EDL). This article analyzes EDL from the perspective of automated machine learning (AutoML). Specifically, we first introduce EDL from the viewpoints of DL and EC and formulate EDL as an optimization problem. Following the DL pipeline, we systematically introduce EDL methods ranging from data preparation and model generation to model deployment under a new taxonomy (i.e., what to evolve/optimize and how), focusing on how EC handles the optimization problem through solution representation and search paradigms. Finally, key applications, open issues, and promising lines of future research are discussed. This survey reviews recent developments in EDL and offers insightful guidelines for its further development.
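
The abstract frames EDL as casting the design of a DL model (e.g., its hyperparameters or architecture) as an optimization problem, with EC supplying the solution representation and the search paradigm. As a rough illustration of that framing only, and not of the survey's own method, the sketch below runs a tiny genetic algorithm over a discrete hyperparameter space; the search space, the operators, and the placeholder fitness function are all illustrative assumptions.

```python
import random

# Solution representation: one gene per hyperparameter (categorical choices).
SEARCH_SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "num_layers": [2, 4, 8],
    "hidden_units": [64, 128, 256],
}

def random_individual():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def crossover(a, b):
    # Uniform crossover: each gene is inherited from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(ind, rate=0.3):
    # Re-sample each gene with a small probability.
    return {k: (random.choice(v) if random.random() < rate else ind[k])
            for k, v in SEARCH_SPACE.items()}

def fitness(ind):
    """Placeholder fitness. In a real EDL system this would train a DL model
    with the given hyperparameters and return its validation accuracy."""
    return random.random()

def evolve(pop_size=10, generations=5):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

if __name__ == "__main__":
    print("Best hyperparameters found:", evolve())
```

The same loop structure carries over to the survey's broader scope: replacing the gene dictionary with an architecture encoding or an augmentation policy changes what is evolved, while the selection, crossover, and mutation steps remain the search paradigm.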

Supplementary Material

3603704.supp (3603704.supp.pdf)
Supplementary material

Published In

ACM Computing Surveys, Volume 56, Issue 2
February 2024
974 pages
EISSN: 1557-7341
DOI: 10.1145/3613559

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 15 September 2023
Online AM: 07 June 2023
Accepted: 31 May 2023
Revised: 06 April 2023
Received: 21 August 2022
Published in CSUR Volume 56, Issue 2

Author Tags

  1. Deep learning
  2. evolutionary computation
  3. data preparation
  4. model generation
  5. model deployment

Qualifiers

  • Survey

Funding Sources

  • National Natural Science Foundation of China
  • National Key R&D Program of China
  • Project funded by China Postdoctoral Science Foundation
  • Joint Funds of the Natural Science Foundation of Liaoning Province
  • Fundamental Research Funds for the Central Universities
