Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap

Published: 08 October 2021

Abstract

Neural architecture search (NAS) has attracted increasing attention. In recent years, individual search methods have been replaced by weight-sharing search methods for higher search efficiency, but the latter often suffer from instability. This article provides a literature review on these methods and attributes this issue to the optimization gap. From this perspective, we summarize existing approaches into several categories according to their efforts to bridge the gap, and we analyze both the advantages and disadvantages of these methodologies. Finally, we share our opinions on the future directions of NAS and AutoML. Owing to the expertise of the authors, this article mainly focuses on the application of NAS to computer vision problems.
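
To make the weight-sharing idea concrete, below is a minimal DARTS-style sketch, not taken from any particular surveyed implementation; the operation set, channel sizes, and training step are illustrative assumptions. All candidate operations live in one shared super-network, and learnable architecture parameters softly select among them, so every candidate architecture is evaluated with shared weights rather than trained from scratch.

```python
# Minimal sketch of weight-sharing (differentiable) NAS on a single edge.
# Assumptions: a toy operation set {3x3 conv, 5x5 conv, skip}, 16 channels,
# and a stand-in loss; real systems search many edges over real task losses.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One super-network edge: a softmax-weighted sum of candidate ops."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),  # 5x5 conv
            nn.Identity(),                                            # skip connect
        ])
        # One architecture parameter per candidate op; trained jointly with
        # the shared weights here (some methods instead alternate updates).
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# One search step: shared weights and architecture parameters update together.
edge = MixedOp(channels=16)
optimizer = torch.optim.SGD(edge.parameters(), lr=0.01)
x = torch.randn(2, 16, 8, 8)
loss = edge(x).pow(2).mean()  # stand-in for a task loss
loss.backward()
optimizer.step()
```

After search, the discrete architecture keeps only the highest-weighted operation on each edge; the mismatch between that discretized network and the trained super-network is one face of the optimization gap this survey discusses.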

References

[1]
George Adam and Jonathan Lorraine. 2019. Understanding neural architecture search techniques. Retrieved from https://arXiv:1904.00438.
[2]
Karim Ahmed and Lorenzo Torresani. 2018. Maskconnect: Connectivity learning by gradient descent. In Proceedings of the European Conference on Computer Vision.
[3]
Youhei Akimoto, Shinichi Shirakawa, Nozomu Yoshinari, Kento Uchida, Shota Saito, and Kouhei Nishida. 2019. Adaptive stochastic natural gradient method for one-shot neural architecture search. Retrieved from https://arXiv:1905.08537.
[4]
Jie An, Haoyi Xiong, Jun Huan, and Jiebo Luo. 2020. Ultrafast photorealistic style transfer via neural architecture search. In Proceedings of the AAAI Conference on Artificial Intelligence.
[5]
Marcin Andrychowicz, Misha Denil, Sergio Gomez, Matthew W. Hoffman, David Pfau, Tom Schaul, Brendan Shillingford, and Nando De Freitas. 2016. Learning to learn by gradient descent by gradient descent. In Advances in Neural Information Processing Systems. MIT Press.
[6]
Randy Ardywibowo, Shahin Boluki, Xinyu Gong, Zhangyang Wang, and Xiaoning Qian. 2020. NADS: Neural architecture distribution search for uncertainty awareness. Retrieved from https://arXiv:2006.06646.
[7]
Woong Bae, Seungho Lee, Yeha Lee, Beomhee Park, Minki Chung, and Kyu-Hwan Jung. 2019. Resource optimized neural architecture search for 3D medical image segmentation. In Medical Image Computing and Computer-Assisted Intervention. Springer.
[8]
Bowen Baker, Otkrist Gupta, Nikhil Naik, and Ramesh Raskar. 2017. Designing neural network architectures using reinforcement learning. In Proceedings of the International Conference on Learning Representations.
[9]
Bowen Baker, Otkrist Gupta, Ramesh Raskar, and Nikhil Naik. 2017. Accelerating neural architecture search using performance prediction. Retrieved from https://arXiv:1705.10823.
[10]
Prasanna Balaprakash, Romain Egele, Misha Salim, Stefan Wild, Venkatram Vishwanath, Fangfang Xia, Tom Brettin, and Rick Stevens. 2019. Scalable reinforcement-learning-based neural architecture search for cancer deep learning research. In Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis.
[11]
Maria Baldeon-Calisto and Susana K. Lai-Yuen. 2020. AdaResU-Net: Multiobjective adaptive convolutional neural network for medical image segmentation. Neurocomputing 392 (2020), 325–340.
[12]
Ahmed Baruwa, Mojeed Abisiga, Ibrahim Gbadegesin, and Afeez Fakunle. 2019. Leveraging end-to-end speech recognition with neural architecture search. Retrieved from https://arXiv:1912.05946.
[13]
Pouya Bashivan, Mark Tensen, and James J. DiCarlo. 2019. Teacher guided architecture search. In Proceedings of the International Conference on Computer Vision.
[14]
Justin Bayer, Daan Wierstra, Julian Togelius, and Jürgen Schmidhuber. 2009. Evolving memory cell structures for sequence learning. In Proceedings of the International Conference on Artificial Neural Networks.
[15]
Irwan Bello, Barret Zoph, Vijay Vasudevan, and Quoc V. Le. 2017. Neural optimizer search with reinforcement learning. In Proceedings of the International Conference on Machine Learning.
[16]
Gabriel Bender, Pieter-Jan Kindermans, Barret Zoph, Vijay Vasudevan, and Quoc Le. 2018. Understanding and simplifying one-shot architecture search. In Proceedings of the International Conference on Machine Learning.
[17]
Gabriel Bender, Hanxiao Liu, Bo Chen, Grace Chu, Shuyang Cheng, Pieter-Jan Kindermans, and Quoc V. Le. 2020. Can weight sharing outperform random architecture search? An investigation with TuNAS. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[18]
James Bergstra, Daniel Yamins, and David Cox. 2013. Making a science of model search: Hyperparameter optimization in hundreds of dimensions for vision architectures. In Proceedings of the International Conference on Machine Learning.
[19]
James S. Bergstra, Rémi Bardenet, Yoshua Bengio, and Balázs Kégl. 2011. Algorithms for hyper-parameter optimization. In Advances in Neural Information Processing Systems. MIT Press.
[20]
Kaifeng Bi, Changping Hu, Lingxi Xie, Xin Chen, Longhui Wei, and Qi Tian. 2019. Stabilizing DARTS with amended gradient estimation on architectural parameters. Retrieved from https://arXiv:1910.11831.
[21]
Kaifeng Bi, Lingxi Xie, Xin Chen, Longhui Wei, and Qi Tian. 2020. GOLD-NAS: Gradual, one-level, differentiable. Retrieved from https://arXiv:2007.03331.
[22]
Simone Bianco, Marco Buzzelli, Gianluigi Ciocca, and Raimondo Schettini. 2020. Neural architecture search for image saliency fusion. Info. Fusion 57 (2020), 89–101.
[23]
Andrew Brock, Theodore Lim, James M. Ritchie, and Nick Weston. 2017. SMASH: One-shot model architecture search through hypernetworks. Retrieved from https://arXiv:1708.05344.
[24]
Han Cai, Tianyao Chen, Weinan Zhang, Yong Yu, and Jun Wang. 2018. Efficient architecture search by network transformation. In Proceedings of the AAAI Conference on Artificial Intelligence.
[25]
Han Cai, Chuang Gan, and Song Han. 2020. Once for all: Train one network and specialize it for efficient deployment. In Proceedings of the International Conference on Learning Representations.
[26]
Han Cai, Tianzhe Wang, Zhanghao Wu, Kuan Wang, Ji Lin, and Song Han. 2019. On-device image classification with proxyless neural architecture search and quantization-aware fine-tuning. In Proceedings of the International Conference on Computer Vision Workshops.
[27]
Han Cai, Jiacheng Yang, Weinan Zhang, Song Han, and Yong Yu. 2018. Path-level network transformation for efficient architecture search. In Proceedings of the International Conference on Machine Learning.
[28]
Han Cai, Ligeng Zhu, and Song Han. 2019. Proxylessnas: Direct neural architecture search on target task and hardware. In Proceedings of the International Conference on Learning Representations.
[29]
Zhaowei Cai and Nuno Vasconcelos. 2020. Rethinking differentiable search for mixed-precision neural networks. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[30]
Fabio Maria Carlucci, Pedro Esperanca, Rasul Tutunov, Marco Singh, Victor Gabillon, Antoine Yang, Hang Xu, Zewei Chen, and Jun Wang. 2019. MANAS: Multi-agent neural architecture search. Retrieved from https://arXiv:1909.01051.
[31]
Francesco Paolo Casale, Jonathan Gordon, and Nicolo Fusi. 2019. Probabilistic neural architecture search. Retrieved from https://arXiv:1902.05116.
[32]
Jianlong Chang, Yiwen Guo, Gaofeng Meng, Shiming Xiang, Chunhong Pan et al. 2019. DATA: Differentiable architecture approximation. In Advances in Neural Information Processing Systems. MIT Press.
[33]
Bo Chen, Golnaz Ghiasi, Hanxiao Liu, Tsung-Yi Lin, Dmitry Kalenichenko, Hartwig Adam, and Quoc V. Le. 2020. Mnasfpn: Learning latency-aware pyramid architecture for object detection on mobile devices. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[34]
Daoyuan Chen, Yaliang Li, Minghui Qiu, Zhen Wang, Bofang Li, Bolin Ding, Hongbo Deng, Jun Huang, Wei Lin, and Jingren Zhou. 2020. Adabert: Task-adaptive bert compression with differentiable neural architecture search. Retrieved from https://arXiv:2001.04246.
[35]
Hanlin Chen, Baochang Zhang Li’an Zhuo, Xiawu Zheng, Jianzhuang Liu, David Doermann, and Rongrong Ji. 2020. Binarized neural architecture search. In Proceedings of the AAAI Conference on Artificial Intelligence.
[36]
Junkun Chen, Kaiyu Chen, Xinchi Chen, Xipeng Qiu, and Xuanjing Huang. 2018. Exploring shared structures and hierarchies for multiple nlp tasks. Retrieved from https://arXiv:1808.07658.
[37]
Liang-Chieh Chen, Maxwell Collins, Yukun Zhu, George Papandreou, Barret Zoph, Florian Schroff, Hartwig Adam, and Jon Shlens. 2018. Searching for efficient multi-scale architectures for dense image prediction. In Advances in Neural Information Processing Systems. MIT Press.
[38]
Liang-Chieh Chen, George Papandreou, Iasonas Kokkinos, Kevin Murphy, and Alan L. Yuille. 2017. Deeplab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected crfs. IEEE Trans. Pattern Anal. Mach. Intell. 40, 4 (2017), 834–848.
[39]
Shoufa Chen, Yunpeng Chen, Shuicheng Yan, and Jiashi Feng. 2019. Efficient differentiable neural architecture search with meta kernels. Retrieved from https://arXiv:1912.04749.
[40]
Tianqi Chen, Ian Goodfellow, and Jonathon Shlens. 2016. Net2net: Accelerating learning via knowledge transfer. In Proceedings of the International Conference on Learning Representations.
[41]
Wenlin Chen, James Wilson, Stephen Tyree, Kilian Weinberger, and Yixin Chen. 2015. Compressing neural networks with the hashing trick. In Proceedings of the International Conference on Machine Learning.
[42]
Xin Chen, Yawen Duan, Zewei Chen, Hang Xu, Zihao Chen, Xiaodan Liang, Tong Zhang, and Zhenguo Li. 2020. CATCH: Context-based meta reinforcement learning for transferrable architecture search. In Proceedings of the European Conference on Computer Vision.
[43]
Xiangning Chen and Cho-Jui Hsieh. 2020. Stabilizing differentiable architecture search via perturbation-based regularization. Retrieved from https://arXiv:2002.05283.
[44]
Xin Chen, Lingxi Xie, Jun Wu, and Qi Tian. 2019. Progressive differentiable architecture search: Bridging the depth gap between search and evaluation. In Proceedings of the International Conference on Computer Vision.
[45]
Xin Chen, Lingxi Xie, Jun Wu, Longhui Wei, Yuhui Xu, and Qi Tian. 2020. Fitting the search space of weight-sharing NAS with graph convolutional networks. Retrieved from https://arXiv:2004.08423.
[46]
Yukang Chen, Gaofeng Meng, Qian Zhang, Shiming Xiang, Chang Huang, Lisen Mu, and Xinggang Wang. 2019. Renas: Reinforced evolutionary neural architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[47]
Yukang Chen, Gaofeng Meng, Qian Zhang, Xinbang Zhang, Liangchen Song, Shiming Xiang, and Chunhong Pan. 2018. Joint neural architecture search and quantization. Retrieved from https://arXiv:1811.09426.
[48]
Yukang Chen, Tong Yang, Xiangyu Zhang, Gaofeng Meng, Xinyu Xiao, and Jian Sun. 2019. DetNAS: Backbone search for object detection. In Advances in Neural Information Processing Systems. MIT Press.
[49]
Yushi Chen, Kaiqiang Zhu, Lin Zhu, Xin He, Pedram Ghamisi, and Jón Atli Benediktsson. 2019. Automatic design of convolutional neural network for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 57, 9 (2019), 7048–7066.
[50]
An-Chieh Cheng, Jin-Dong Dong, Chi-Hung Hsu, Shu-Huan Chang, Min Sun, Shih-Chieh Chang, Jia-Yu Pan, Yu-Ting Chen, Wei Wei, and Da-Cheng Juan. 2018. Searching toward pareto-optimal device-aware neural architectures. In Proceedings of the International Conference on Computer-Aided Design.
[51]
An-Chieh Cheng, Chieh Hubert Lin, Da-Cheng Juan, Wei Wei, and Min Sun. 2020. InstaNAS: Instance-aware neural architecture search. In Proceedings of the AAAI Conference on Artificial Intelligence.
[52]
Hsin-Pai Cheng, Tunhou Zhang, Yukun Yang, Feng Yan, Shiyu Li, Harris Teague, Hai Li, and Yiran Chen. 2019. SwiftNet: Using graph propagation as meta-knowledge to search highly representative neural architectures. Retrieved from https://arXiv:1906.08305.
[53]
Hsin-Pai Cheng, Tunhou Zhang, Yukun Yang, Feng Yan, Harris Teague, Yiran Chen, and Hai Li. 2019. Msnet: Structural wired neural architecture search for internet of things. In Proceedings of the International Conference on Computer Vision Workshops.
[54]
Weiyu Cheng, Yanyan Shen, and Linpeng Huang. 2020. Differentiable neural input search for recommender systems. Retrieved from https://arXiv:2006.04466.
[55]
Minsu Cho, Mohammadreza Soltani, and Chinmay Hegde. 2019. One-shot neural architecture search via compressive sensing. Retrieved from https://arXiv:1906.02869.
[56]
François Chollet. 2017. Xception: Deep learning with depthwise separable convolutions. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[57]
Xiangxiang Chu, Bo Zhang, Jixiang Li, Qingyuan Li, and Ruijun Xu. 2019. Scarletnas: Bridging the gap between scalability and fairness in neural architecture search. Retrieved from https://arXiv:1908.06022.
[58]
Xiangxiang Chu, Bo Zhang, Hailong Ma, Ruijun Xu, Jixiang Li, and Qingyuan Li. 2019. Fast, accurate and lightweight super-resolution with neural architecture search. Retrieved from https://arXiv:1901.07261.
[59]
Xiangxiang Chu, Bo Zhang, and Ruijun Xu. 2020. MoGA: Searching beyond MobileNetV3. In Proceedings of the International Conference on Acoustics, Speech and Signal Processing.
[60]
Xiangxiang Chu, Bo Zhang, Ruijun Xu, and Jixiang Li. 2019. Fairnas: Rethinking evaluation fairness of weight sharing neural architecture search. Retrieved from https://arXiv:1907.01845.
[61]
Xiangxiang Chu, Bo Zhang, Ruijun Xu, and Hailong Ma. 2019. Multi-objective reinforced evolution in mobile neural architecture search. Retrieved from https://arXiv:1901.01074.
[62]
Xiangxiang Chu, Tianbao Zhou, Bo Zhang, and Jixiang Li. 2020. Fair darts: Eliminating unfair advantages in differentiable architecture search. In Proceedings of the European Conference on Computer Vision.
[63]
Marc Claesen and Bart De Moor. 2015. Hyperparameter search in machine learning. Retrieved from https://arXiv:1502.02127.
[64]
Corinna Cortes, Xavier Gonzalvo, Vitaly Kuznetsov, Mehryar Mohri, and Scott Yang. 2017. Adanet: Adaptive structural learning of artificial neural networks. In Proceedings of the International Conference on Machine Learning.
[65]
Ekin D. Cubuk, Barret Zoph, Dandelion Mane, Vijay Vasudevan, and Quoc V. Le. 2019. Autoaugment: Learning augmentation policies from data. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[66]
Ekin D. Cubuk, Barret Zoph, Jonathon Shlens, and Quoc V. Le. 2020. Randaugment: Practical automated data augmentation with a reduced search space. In Proceedings of the Computer Vision and Pattern Recognition Workshops.
[67]
Xiaoliang Dai, Peizhao Zhang, Bichen Wu, Hongxu Yin, Fei Sun, Yanghan Wang, Marat Dukhan, Yunqing Hu, Yiming Wu, Yangqing Jia et al. 2019. Chamnet: Towards efficient network design through platform-aware model adaptation. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[68]
Boyang Deng, Junjie Yan, and Dahua Lin. 2017. Peephole: Predicting network performance before training. Retrieved from https://arXiv:1712.03351.
[69]
Jia Deng, Wei Dong, Richard Socher, Li-Jia Li, Kai Li, and Li Fei-Fei. 2009. Imagenet: A large-scale hierarchical image database. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[70]
Georgi Dikov, Patrick van der Smagt, and Justin Bayer. 2019. Bayesian learning of neural network architectures. Retrieved from https://arXiv:1901.04436.
[71]
Shaojin Ding, Tianlong Chen, Xinyu Gong, Weiwei Zha, and Zhangyang Wang. 2020. AutoSpeech: Neural architecture search for speaker recognition. Retrieved from https://arXiv:2005.03215.
[72]
Shifei Ding, Hui Li, Chunyang Su, Junzhao Yu, and Fengxiang Jin. 2013. Evolutionary artificial neural networks: A review. Artific. Intell. Rev. 39, 3 (2013), 251–260.
[73]
Hongwei Dong, Bin Zou, Lamei Zhang, and Siyu Zhang. 2020. Automatic design of CNNs via differentiable neural architecture search for PolSAR image classification. IEEE Trans. Geosci. Remote Sens. 58, 9 (2020), 6362–6375.
[74]
Jin-Dong Dong, An-Chieh Cheng, Da-Cheng Juan, Wei Wei, and Min Sun. 2018. Dpp-net: Device-aware progressive search for pareto-optimal neural architectures. In Proceedings of the European Conference on Computer Vision.
[75]
Nanqing Dong, Min Xu, Xiaodan Liang, Yiliang Jiang, Wei Dai, and Eric Xing. 2019. Neural architecture search for adversarial medical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention. Springer, 828–836.
[76]
Xuanyi Dong, Mingxing Tan, Adams Wei Yu, Daiyi Peng, Bogdan Gabrys, and Quoc V. Le. 2020. AutoHAS: Differentiable hyper-parameter and architecture search. Retrieved from https://arXiv:2006.03656.
[77]
Xuanyi Dong and Yi Yang. 2019. Network pruning via transformable architecture search. In Advances in Neural Information Processing Systems. MIT Press.
[78]
Xuanyi Dong and Yi Yang. 2019. One-shot neural architecture search via self-evaluated template network. In Proceedings of the International Conference on Computer Vision.
[79]
Xuanyi Dong and Yi Yang. 2019. Searching for a robust neural architecture in four GPU hours. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[80]
Xuanyi Dong and Yi Yang. 2020. Nas-bench-201: Extending the scope of reproducible neural architecture search. In Proceedings of the International Conference on Learning Representations.
[81]
Sivan Doveh, Eli Schwartz, Chao Xue, Rogerio Feris, Alex Bronstein, Raja Giryes, and Leonid Karlinsky. 2019. MetAdapt: Meta-learned task-adaptive architecture for few-shot classification. Retrieved from https://arXiv:1912.00412.
[82]
Xianzhi Du, Tsung-Yi Lin, Pengchong Jin, Golnaz Ghiasi, Mingxing Tan, Yin Cui, Quoc V. Le, and Xiaodan Song. 2020. SpineNet: Learning scale-permuted backbone for recognition and localization. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[83]
Alina Dubatovka, Efi Kokiopoulou, Luciano Sbaiz, Andrea Gesmundo, Gabor Bartok, and Jesse Berent. 2019. Ranking architectures using meta-learning. Retrieved from https://arXiv:1911.11481.
[84]
Jayanta K. Dutta, Jiayi Liu, Unmesh Kurup, and Mohak Shah. 2018. Effective building block design for deep convolutional neural networks using search. Retrieved from https://arXiv:1801.08577.
[85]
Thomas Elsken, Jan-Hendrik Metzen, and Frank Hutter. 2017. Simple and efficient architecture search for convolutional neural networks. Retrieved from https://arXiv:1711.04528.
[86]
Thomas Elsken, Jan Hendrik Metzen, and Frank Hutter. 2019. Efficient multi-objective neural architecture search via lamarckian evolution. In Proceedings of the International Conference on Learning Representations.
[87]
Thomas Elsken, Jan Hendrik Metzen, and Frank Hutter. 2019. Neural architecture search: A survey. J. Mach. Learn. Res. 20, 55 (2019), 1–21.
[88]
Jiemin Fang, Yukang Chen, Xinbang Zhang, Qian Zhang, Chang Huang, Gaofeng Meng, Wenyu Liu, and Xinggang Wang. 2019. EAT-NAS: Elastic architecture transfer for accelerating large-scale neural architecture search. Retrieved from https://arXiv:1901.05884.
[89]
Jiemin Fang, Yuzhu Sun, Kangjian Peng, Qian Zhang, Yuan Li, Wenyu Liu, and Xinggang Wang. 2020. Fast neural network adaptation via parameter remapping and architecture search. In Proceedings of the International Conference on Learning Representations.
[90]
Jiemin Fang, Yuzhu Sun, Qian Zhang, Yuan Li, Wenyu Liu, and Xinggang Wang. 2020. Densely connected search space for more flexible neural architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[91]
Igor Fedorov, Ryan P. Adams, Matthew Mattina, and Paul Whatmough. 2019. Sparse: Sparse architecture search for cnns on resource-constrained microcontrollers. In Advances in Neural Information Processing Systems. MIT Press.
[92]
Javier Fernandez-Marques, Paul N. Whatmough, Andrew Mundy, and Matthew Mattina. 2020. Searching for winograd-aware quantized networks. Retrieved from https://arXiv:2002.10711.
[93]
Matthias Feurer and Frank Hutter. 2019. Hyperparameter optimization. In Automated Machine Learning. Springer, Cham, 3–33.
[94]
Matthias Feurer, Jost Tobias Springenberg, and Frank Hutter. 2015. Initializing bayesian hyperparameter optimization via meta-learning. In Proceedings of the AAAI Conference on Artificial Intelligence.
[95]
Ben Fielding and Li Zhang. 2018. Evolving image classification architectures with enhanced particle swarm optimisation. IEEE Access 6 (2018), 68560–68575.
[96]
David Friede, Jovita Lukasik, Heiner Stuckenschmidt, and Margret Keuper. 2019. A variational-sequential graph autoencoder for neural architecture performance prediction. Retrieved from https://arXiv:1912.05317.
[97]
Yonggan Fu, Wuyang Chen, Haotao Wang, Haoran Li, Yingyan Lin, and Zhangyang Wang. 2020. AutoGAN-distiller: Searching to compress generative adversarial networks. Retrieved from https://arXiv:2006.08198.
[98]
Chen Gao, Yunpeng Chen, Si Liu, Zhenxiong Tan, and Shuicheng Yan. 2020. Adversarialnas: Adversarial neural architecture search for gans. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[99]
Yuan Gao, Haoping Bai, Zequn Jie, Jiayi Ma, Kui Jia, and Wei Liu. 2020. Mtl-nas: Task-agnostic neural architecture search towards general-purpose multi-task learning. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[100]
Yang Gao, Hong Yang, Peng Zhang, Chuan Zhou, and Yue Hu. 2019. Graphnas: Graph neural architecture search with reinforcement learning. Retrieved from https://arXiv:1904.09981.
[101]
Yonatan Geifman and Ran El-Yaniv. 2019. Deep active learning with a neural architecture search. In Advances in Neural Information Processing Systems. MIT Press.
[102]
Golnaz Ghiasi, Tsung-Yi Lin, and Quoc V. Le. 2019. Nas-fpn: Learning scalable feature pyramid architecture for object detection. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[103]
Ross Girshick. 2015. Fast r-cnn. In Proceedings of the International Conference on Computer Vision.
[104]
Xinyu Gong, Shiyu Chang, Yifan Jiang, and Zhangyang Wang. 2019. Autogan: Neural architecture search for generative adversarial networks. In Proceedings of the International Conference on Computer Vision.
[105]
Yunchao Gong, Liu Liu, Ming Yang, and Lubomir Bourdev. 2014. Compressing deep convolutional networks using vector quantization. Retrieved from https://arXiv:1412.6115.
[106]
Ariel Gordon, Elad Eban, Ofir Nachum, Bo Chen, Hao Wu, Tien-Ju Yang, and Edward Choi. 2018. Morphnet: Fast & simple resource-constrained structure learning of deep networks. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[107]
Sam Green, Craig M. Vineyard, Ryan Helinski, and Cetin Kaya Koç. 2019. RAPDARTS: Resource-aware progressive differentiable architecture search. Retrieved from https://arXiv:1911.05704.
[108]
Jindong Gu and Volker Tresp. 2020. Search for better students to learn distilled knowledge. Retrieved from https://arXiv:2001.11612.
[109]
Dazhou Guo, Dakai Jin, Zhuotun Zhu, Tsung-Ying Ho, Adam P. Harrison, Chun-Hung Chao, Jing Xiao, and Le Lu. 2020. Organ at risk segmentation for head and neck cancer using stratified learning and neural architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[110]
Jianyuan Guo, Kai Han, Yunhe Wang, Chao Zhang, Zhaohui Yang, Han Wu, Xinghao Chen, and Chang Xu. 2020. Hit-detector: Hierarchical trinity architecture search for object detection. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[111]
Minghao Guo, Yuzhe Yang, Rui Xu, Ziwei Liu, and Dahua Lin. 2020. When NAS meets robustness: In search of robust architectures against adversarial attacks. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[112]
Minghao Guo, Zhao Zhong, Wei Wu, Dahua Lin, and Junjie Yan. 2019. Irlas: Inverse reinforcement learning for architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[113]
Ronghao Guo, Chen Lin, Chuming Li, Keyu Tian, Ming Sun, Lu Sheng, and Junjie Yan. 2020. Powering one-shot topological NAS with stabilized share-parameter proxy. In Proceedings of the European Conference on Computer Vision.
[114]
Yong Guo, Yongsheng Luo, Zhenhao He, Jin Huang, and Jian Chen. 2020. Hierarchical neural architecture search for single image super-resolution. Retrieved from https://arXiv:2003.04619.
[115]
Yong Guo, Yin Zheng, Mingkui Tan, Qi Chen, Jian Chen, Peilin Zhao, and Junzhou Huang. 2019. Nat: Neural architecture transformer for accurate and compact architectures. In Advances in Neural Information Processing Systems. MIT Press.
[116]
Zichao Guo, Xiangyu Zhang, Haoyuan Mu, Wen Heng, Zechun Liu, Yichen Wei, and Jian Sun. 2019. Single path one-shot neural architecture search with uniform sampling. Retrieved from https://arXiv:1904.00420.
[117]
Song Han, Han Cai, Ligeng Zhu, Ji Lin, Kuan Wang, Zhijian Liu, and Yujun Lin. 2019. Design automation for efficient deep learning computing. Retrieved from https://arXiv:1904.10616.
[118]
Song Han, Huizi Mao, and William J. Dally. 2016. Deep compression: Compressing deep neural networks with pruning, trained quantization and huffman coding. In Proceedings of the International Conference on Learning Representations.
[119]
Song Han, Jeff Pool, John Tran, and William Dally. 2015. Learning both weights and connections for efficient neural network. In Advances in Neural Information Processing Systems. MIT Press.
[120]
Shayan Hassantabar, Xiaoliang Dai, and Niraj K. Jha. 2019. STEERAGE: Synthesis of neural networks using architecture search and grow-and-prune methods. Retrieved from https://arXiv:1912.05831.
[121]
Chaoyang He, Murali Annavaram, and Salman Avestimehr. 2020. FedNAS: Federated deep learning via neural architecture search. Retrieved from https://arXiv:2004.08546.
[122]
Chaoyang He, Haishan Ye, Li Shen, and Tong Zhang. 2020. Milenas: Efficient neural architecture search via mixed-level reformulation. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[123]
Kaiming He, Georgia Gkioxari, Piotr Dollár, and Ross Girshick. 2017. Mask r-cnn. In Proceedings of the International Conference on Computer Vision.
[124]
Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. 2016. Deep residual learning for image recognition. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[125]
Xin He, Kaiyong Zhao, and Xiaowen Chu. 2019. AutoML: A survey of the state-of-the-art. Retrieved from https://arXiv:1908.00709.
[126]
Geoffrey Hinton, Oriol Vinyals, and Jeff Dean. 2015. Distilling the knowledge in a neural network. Retrieved from https://arXiv:1503.02531.
[127]
Daniel Ho, Eric Liang, Xi Chen, Ion Stoica, and Pieter Abbeel. 2019. Population based augmentation: Efficient learning of augmentation policy schedules. In Proceedings of the International Conference on Machine Learning.
[128]
Kary Ho, Andrew Gilbert, Hailin Jin, and John Collomosse. 2020. Neural architecture search for deep image prior. Retrieved from https://arXiv:2001.04776.
[129]
Hyeong Gwon Hong, Pyunghwan Ahn, and Junmo Kim. 2019. EDAS: Efficient and differentiable architecture search. Retrieved from https://arXiv:1912.01237.
[130]
Andrew Howard, Mark Sandler, Grace Chu, Liang-Chieh Chen, Bo Chen, Mingxing Tan, Weijun Wang, Yukun Zhu, Ruoming Pang, Vijay Vasudevan et al. 2019. Searching for mobilenetv3. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[131]
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, Marco Andreetto, and Hartwig Adam. 2017. Mobilenets: Efficient convolutional neural networks for mobile vision applications. Retrieved from https://arXiv:1704.04861.
[132]
Chi-Hung Hsu, Shu-Huan Chang, Jhao-Hong Liang, Hsin-Ping Chou, Chun-Hao Liu, Shih-Chieh Chang, Jia-Yu Pan, Yu-Ting Chen, Wei Wei, and Da-Cheng Juan. 2018. Monas: Multi-objective neural architecture search using reinforcement learning. Retrieved from https://arXiv:1806.10332.
[133]
Hanzhang Hu, John Langford, Rich Caruana, Saurajit Mukherjee, Eric J. Horvitz, and Debadeepta Dey. 2019. Efficient forward architecture search. In Advances in Neural Information Processing Systems.
[134]
Jie Hu, Li Shen, and Gang Sun. 2018. Squeeze-and-excitation networks. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[135]
Yutao Hu, Xiaolong Jiang, Xuhui Liu, Baochang Zhang, Jungong Han, Xianbin Cao, and David Doermann. 2020. NAS-Count: Counting-by-density with neural architecture search. Retrieved from https://arXiv:2003.00217.
[136]
Gao Huang, Danlu Chen, Tianhong Li, Felix Wu, Laurens van der Maaten, and Kilian Q. Weinberger. 2018. Multi-scale dense networks for resource efficient image classification. In Proceedings of the International Conference on Learning Representations.
[137]
Gao Huang, Zhuang Liu, Laurens Van Der Maaten, and Kilian Q. Weinberger. 2017. Densely connected convolutional networks. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[138]
Shenyang Huang, Vincent François-Lavet, and Guillaume Rabusseau. 2019. Neural architecture search for class-incremental learning. Retrieved from https://arXiv:1909.06686.
[139]
Siyu Huang, Xi Li, Zhi-Qi Cheng, Zhongfei Zhang, and Alexander Hauptmann. 2018. Gnas: A greedy neural architecture search method for multi-attribute learning. In ACM Proceedings of the International Conference on Multimedia.
[140]
Sian-Yao Huang and Wei-Ta Chu. 2020. PONAS: Progressive one-shot neural architecture search for very efficient deployment. Retrieved from https://arXiv:2003.05112.
[141]
Zhiheng Huang and Bing Xiang. 2019. Wenet: Weighted networks for recurrent network architecture search. Retrieved from https://arXiv:1904.03819.
[142]
Andrew Hundt, Varun Jain, and Gregory D. Hager. 2019. sharpdarts: Faster and more accurate differentiable architecture search. Retrieved from https://arXiv:1903.09900.
[143]
Forrest N. Iandola, Song Han, Matthew W. Moskewicz, Khalid Ashraf, William J. Dally, and Kurt Keutzer. 2016. SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and< 0.5 MB model size. Retrieved from https://arXiv:1602.07360.
[144]
Sergey Ioffe and Christian Szegedy. 2015. Batch normalization: Accelerating deep network training by reducing internal covariate shift. In Proceedings of the International Conference on Machine Learning.
[145]
William Irwin-Harris, Yanan Sun, Bing Xue, and Mengjie Zhang. 2019. A graph-based encoding for evolutionary convolutional neural network architecture design. In Proceedings of the IEEE Congress on Evolutionary Computation.
[146]
Roxana Istrate, Florian Scheidegger, Giovanni Mariani, Dimitrios Nikolopoulos, Costas Bekas, and A. Cristiano I. Malossi. 2019. Tapas: Train-less accuracy predictor for architecture search. In Proceedings of the AAAI Conference on Artificial Intelligence.
[147]
Yesmina Jaafra, Jean Luc Laurent, Aline Deruyver, and Mohamed Saber Naceur. 2019. Reinforcement learning for neural architecture search: A review. Image Vision Comput. 89 (2019), 57–66.
[148]
Chenhan Jiang, Shaoju Wang, Xiaodan Liang, Hang Xu, and Nong Xiao. 2020. ElixirNet: Relation-aware network architecture adaptation for medical lesion detection. In Proceedings of the AAAI Conference on Artificial Intelligence. 11093–11100.
[149]
Chenhan Jiang, Hang Xu, Wei Zhang, Xiaodan Liang, and Zhenguo Li. 2020. SP-NAS: Serial-to-parallel backbone search for object detection. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[150]
Weiwen Jiang, Xinyi Zhang, Edwin H.-M. Sha, Lei Yang, Qingfeng Zhuge, Yiyu Shi, and Jingtong Hu. 2019. Accuracy vs. efficiency: Achieving both through fpga-implementation aware neural architecture search. In Proceedings of the Annual Design Automation Conference.
[151]
Yufan Jiang, Chi Hu, Tong Xiao, Chunliang Zhang, and Jingbo Zhu. 2019. Improved differentiable architecture search for language modeling and named entity recognition. In Proceedings of the Empirical Methods in Natural Language Processing and International Joint Conference on Natural Language Processing.
[152]
Yang Jiang, Cong Zhao, Zeyang Dou, and Lei Pang. 2019. Neural architecture refinement: A practical way for avoiding overfitting in NAS. Retrieved from https://arXiv:1905.02341.
[153]
Haifeng Jin, Qingquan Song, and Xia Hu. 2018. Efficient neural architecture search with network morphism. Retrieved from https://arXiv:1806.10282.
[154]
Haifeng Jin, Qingquan Song, and Xia Hu. 2019. Auto-keras: An efficient neural architecture search system. In SIGKDD Proceedings of the International Conference on Knowledge Discovery & Data Mining.
[155]
Xiaojie Jin, Jiang Wang, Joshua Slocum, Ming-Hsuan Yang, Shengyang Dai, Shuicheng Yan, and Jiashi Feng. 2019. Rc-darts: Resource constrained differentiable architecture search. Retrieved from https://arXiv:1912.12814.
[156]
Francisco Erivaldo Fernandes Junior and Gary G. Yen. 2019. Particle swarm optimization of deep neural networks architectures for image classification. Swarm Evolution. Comput. 49 (2019), 62–74.
[157]
Purushotham Kamath, Abhishek Singh, and Debo Dutta. 2018. Neural architecture construction using envelopenets. Retrieved from https://arXiv:1803.06744.
[158]
Kirthevasan Kandasamy, Willie Neiswanger, Jeff Schneider, Barnabas Poczos, and Eric P. Xing. 2018. Neural architecture search with bayesian optimisation and optimal transport. In Advances in Neural Information Processing Systems. MIT Press.
[159]
Kirthevasan Kandasamy, Karun Raju Vysyaraju, Willie Neiswanger, Biswajit Paria, Christopher R. Collins, Jeff Schneider, Barnabas Poczos, and Eric P. Xing. 2020. Tuning hyperparameters without grad students: Scalable and robust bayesian optimisation with dragonfly. J. Mach. Learn. Res. 21, 81 (2020), 1–27.
[160]
Sapir Kaplan and Raja Giryes. 2020. Self-supervised neural architecture search. Retrieved from https://arXiv:2007.01500.
[161]
Sungwoong Kim, Ildoo Kim, Sungbin Lim, Woonhyuk Baek, Chiheon Kim, Hyungjoo Cho, Boogeon Yoon, and Taesup Kim. 2019. Scalable neural architecture search for 3D medical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention.
[162]
Aaron Klein and Frank Hutter. 2019. Tabular benchmarks for joint architecture and hyperparameter optimization. Retrieved from https://arXiv:1905.04970.
[163]
Nikita Klyuchnikov, Ilya Trofimov, Ekaterina Artemova, Mikhail Salnikov, Maxim Fedorov, and Evgeny Burnaev. 2020. NAS-Bench-NLP: Neural architecture search benchmark for natural language processing. Retrieved from https://arXiv:2006.07116.
[164]
Efi Kokiopoulou, Anja Hauth, Luciano Sbaiz, Andrea Gesmundo, Gabor Bartok, and Jesse Berent. 2019. Fast task-aware architecture inference. Retrieved from https://arXiv:1902.05781.
[165]
Alex Krizhevsky and Geoffrey Hinton. 2009. Learning Multiple Layers of Features from Tiny Images. Technical Report. University of Toronto.
[166]
Alex Krizhevsky, Ilya Sutskever, and Geoffrey E. Hinton. 2012. Imagenet classification with deep convolutional neural networks. In Advances in Neural Information Processing Systems. MIT Press.
[167]
Brenden M. Lake, Tomer D. Ullman, Joshua B. Tenenbaum, and Samuel J. Gershman. 2017. Building machines that learn and think like people. Behav. Brain Sci. 40 (2017).
[168]
Kevin A. Laube and Andreas Zell. 2019. Shufflenasnets: Efficient cnn models through modified efficient neural architecture search. In Proceedings of the International Joint Conference on Neural Networks.
[169]
Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. 2015. Deep learning. Nature 521, 7553 (2015), 436–444.
[170]
Heung-Chang Lee, Do-Guk Kim, and Bohyung Han. 2020. Efficient decoupled neural architecture search by structure and operation sampling. In Proceedings of the International Conference on Acoustics, Speech and Signal Processing.
[171]
Royson Lee, Łukasz Dudziak, Mohamed Abdelfattah, Stylianos I. Venieris, Hyeji Kim, Hongkai Wen, and Nicholas D. Lane. 2020. Journey towards tiny perceptual super-resolution. In Proceedings of the European Conference on Computer Vision.
[172]
Christiane Lemke, Marcin Budka, and Bogdan Gabrys. 2015. Metalearning: A survey of trends and technologies. Artific. Intell. Rev. 44, 1 (2015), 117–130.
[173]
Guohao Li, Guocheng Qian, Itzel C. Delgadillo, Matthias Muller, Ali Thabet, and Bernard Ghanem. 2020. Sgas: Sequential greedy architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[174]
Guilin Li, Xing Zhang, Zitong Wang, Zhenguo Li, and Tong Zhang. 2019. StacNAS: Towards stable and consistent optimization for differentiable neural architecture search. Retrieved from https://arXiv:1909.11926.
[175]
Hao Li, Hong Zhang, Xiaojuan Qi, Ruigang Yang, and Gao Huang. 2019. Improved techniques for training adaptive deep networks. In Proceedings of the International Conference on Computer Vision.
[176]
Jixiang Li, Chuming Liang, Bo Zhang, Zhao Wang, Fei Xiang, and Xiangxiang Chu. 2019. Neural architecture search on acoustic scene classification. Retrieved from https://arXiv:1912.12825.
[177]
Liam Li and Ameet Talwalkar. 2019. Random search and reproducibility for neural architecture search. Retrieved from https://arXiv:1902.07638.
[178]
Muyang Li, Ji Lin, Yaoyao Ding, Zhijian Liu, Jun-Yan Zhu, and Song Han. 2020. Gan compression: Efficient architectures for interactive conditional gans. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[179]
Ruoteng Li, Robby T. Tan, and Loong-Fah Cheong. 2020. All in one bad weather removal using architectural search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[180]
Ting Li, Junbo Zhang, Kainan Bao, Yuxuan Liang, Yexin Li, and Yu Zheng. 2020. Autost: Efficient neural architecture search for spatio-temporal prediction. In Proceedings of the Conference on Knowledge Discovery and Data Mining.
[181]
Xin Li, Yiming Zhou, Zheng Pan, and Jiashi Feng. 2019. Partial order pruning: for best speed/accuracy trade-off in neural architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[182]
Yinqiao Li, Chi Hu, Yuhao Zhang, Nuo Xu, Yufan Jiang, Tong Xiao, Jingbo Zhu, Tongran Liu, and Changliang Li. 2020. Learning architectures from an extended search space for language modeling. Retrieved from https://arXiv:2005.02593.
[183]
Yingwei Li, Xiaojie Jin, Jieru Mei, Xiaochen Lian, Linjie Yang, Cihang Xie, Qihang Yu, Yuyin Zhou, Song Bai, and Alan L. Yuille. 2020. Neural architecture search for lightweight non-local networks. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[184]
Yaoman Li and Irwin King. 2019. Architecture search for image inpainting. In Proceedings of the International Symposium on Neural Networks.
[185]
Yanwei Li, Lin Song, Yukang Chen, Zeming Li, Xiangyu Zhang, Xingang Wang, and Jian Sun. 2020. Learning dynamic routing for semantic segmentation. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[186]
Hanwen Liang, Shifeng Zhang, Jiacheng Sun, Xingqiu He, Weiran Huang, Kechen Zhuang, and Zhenguo Li. 2019. Darts+: Improved differentiable architecture search with early stopping. Retrieved from https://arXiv:1909.06035.
[187]
Jason Liang, Elliot Meyerson, and Risto Miikkulainen. 2018. Evolutionary architecture search for deep multitask networks. In Proceedings of the Genetic and Evolutionary Computation Conference.
[188]
Sungbin Lim, Ildoo Kim, Taesup Kim, Chiheon Kim, and Sungwoong Kim. 2019. Fast autoaugment. In Advances in Neural Information Processing Systems. MIT Press.
[189]
Peiwen Lin, Peng Sun, Guangliang Cheng, Sirui Xie, Xi Li, and Jianping Shi. 2020. Graph-guided architecture search for real-time semantic segmentation. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[190]
Marius Lindauer and Frank Hutter. 2019. Best practices for scientific research on neural architecture search. Retrieved from https://arXiv:1909.02453.
[191]
Chenxi Liu, Liang-Chieh Chen, Florian Schroff, Hartwig Adam, Wei Hua, Alan L. Yuille, and Li Fei-Fei. 2019. Auto-deeplab: Hierarchical neural architecture search for semantic image segmentation. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[192]
Chenxi Liu, Piotr Dollár, Kaiming He, Ross Girshick, Alan Yuille, and Saining Xie. 2020. Are labels necessary for neural architecture search? In Proceedings of the European Conference on Computer Vision.
[193]
Chenxi Liu, Barret Zoph, Maxim Neumann, Jonathon Shlens, Wei Hua, Li-Jia Li, Li Fei-Fei, Alan Yuille, Jonathan Huang, and Kevin Murphy. 2018. Progressive neural architecture search. In Proceedings of the European Conference on Computer Vision.
[194]
Hanxiao Liu, Karen Simonyan, Oriol Vinyals, Chrisantha Fernando, and Koray Kavukcuoglu. 2018. Hierarchical representations for efficient architecture search. In Proceedings of the International Conference on Learning Representations.
[195]
Hanxiao Liu, Karen Simonyan, and Yiming Yang. 2019. Darts: Differentiable architecture search. In Proceedings of the International Conference on Learning Representations.
[196]
Lanlan Liu and Jia Deng. 2018. Dynamic deep neural networks: Optimizing accuracy-efficiency trade-offs by selective execution. In Proceedings of the AAAI Conference on Artificial Intelligence.
[197]
Peng Liu, Mohammad D. El Basha, Yangjunyi Li, Yao Xiao, Pina C. Sanelli, and Ruogu Fang. 2019. Deep evolutionary networks with expedited genetic algorithms for medical image denoising. Med. Image Anal. 54 (2019), 306–315.
[198]
Peiye Liu, Bo Wu, Huadong Ma, Pavan Kumar Chundi, and Mingoo Seok. 2019. MemNet: Memory-efficiency guided neural architecture search with augment-trim learning. Retrieved from https://arXiv:1907.09569.
[199]
Wei Liu, Dragomir Anguelov, Dumitru Erhan, Christian Szegedy, Scott Reed, Cheng-Yang Fu, and Alexander C. Berg. 2016. SSD: Single shot multibox detector. In Proceedings of the European Conference on Computer Vision.
[200]
Yu Liu, Xuhui Jia, Mingxing Tan, Raviteja Vemulapalli, Yukun Zhu, Bradley Green, and Xiaogang Wang. 2020. Search to distill: Pearls are everywhere but not the eyes. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[201]
Yifan Liu, Bohan Zhuang, Chunhua Shen, Hao Chen, and Wei Yin. 2019. Training compact neural networks via auxiliary overparameterization. Retrieved from https://arXiv:1909.02214.
[202]
Jonathan Long, Evan Shelhamer, and Trevor Darrell. 2015. Fully convolutional networks for semantic segmentation. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[203]
Pablo Ribalta Lorenzo and Jakub Nalepa. 2018. Memetic evolution of deep neural networks. In Proceedings of the Genetic and Evolutionary Computation Conference.
[204]
Ilya Loshchilov and Frank Hutter. 2017. Sgdr: Stochastic gradient descent with warm restarts. In Proceedings of the International Conference on Learning Representations.
[205]
Ilya Loshchilov and Frank Hutter. 2019. Decoupled weight decay regularization. In Proceedings of the International Conference on Learning Representations.
[206]
David G. Lowe. 2004. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vision 60, 2 (2004), 91–110.
[207]
Zhichao Lu, Gautam Sreekumar, Erik Goodman, Wolfgang Banzhaf, Kalyanmoy Deb, and Vishnu Naresh Boddeti. 2020. Neural architecture transfer. Retrieved from https://arXiv:2005.05859.
[208]
Zhichao Lu, Ian Whalen, Vishnu Boddeti, Yashesh Dhebar, Kalyanmoy Deb, Erik Goodman, and Wolfgang Banzhaf. 2019. NSGA-Net: Neural architecture search using multi-objective genetic algorithm. In Proceedings of the Genetic and Evolutionary Computation Conference.
[209]
Renqian Luo, Tao Qin, and Enhong Chen. 2019. Understanding and improving one-shot neural architecture optimization. Retrieved from https://arXiv:1909.10815.
[210]
Renqian Luo, Xu Tan, Rui Wang, Tao Qin, Enhong Chen, and Tie-Yan Liu. 2020. Semi-Supervised neural architecture search. Retrieved from https://arXiv:2002.10389.
[211]
Renqian Luo, Fei Tian, Tao Qin, Enhong Chen, and Tie-Yan Liu. 2018. Neural architecture optimization. In Advances in Neural Information Processing Systems. MIT Press.
[212]
Li Lyna Zhang, Yuqing Yang, Yuhang Jiang, Wenwu Zhu, and Yunxin Liu. 2020. Fast Hardware-aware neural architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition Workshops.
[213]
Lizheng Ma, Jiaxu Cui, and Bo Yang. 2019. Deep neural architecture search with deep graph Bayesian optimization. In Proceedings of the International Conference on Web Intelligence.
[214]
Ningning Ma, Xiangyu Zhang, Hai-Tao Zheng, and Jian Sun. 2018. Shufflenet v2: Practical guidelines for efficient cnn architecture design. In Proceedings of the European Conference on Computer Vision.
[215]
Vladimir Macko, Charles Weill, Hanna Mazzawi, and Javier Gonzalvo. 2019. Improving neural architecture search image classifiers via ensemble learning. Retrieved from https://arXiv:1903.06236.
[216]
Marcus Märtens and Dario Izzo. 2019. Neural network architecture search with differentiable cartesian genetic programming for regression. In Proceedings of the Genetic and Evolutionary Computation Conference.
[217]
Krzysztof Maziarz, Mingxing Tan, Andrey Khorlin, Kuang-Yu Samuel Chang, and Andrea Gesmundo. 2018. Evolutionary-neural hybrid agents for architecture search. Retrieved from https://arXiv:1811.09828.
[218]
Hanna Mazzawi, Xavi Gonzalvo, Aleks Kracun, Prashant Sridhar, Niranjan Subrahmanya, Ignacio Lopez-Moreno, Hyun-Jin Park, and Patrick Violette. 2019. Improving keyword spotting and language identification via neural architecture search at scale. In Proceedings of the Conference of the International Speech Communication Association (INTERSPEECH’19).
[219]
Mason McGill and Pietro Perona. 2017. Deciding how to decide: Dynamic routing in artificial neural networks. Retrieved from https://arXiv:1703.06217.
[220]
Jieru Mei, Yingwei Li, Xiaochen Lian, Xiaojie Jin, Linjie Yang, Alan Yuille, and Jianchao Yang. 2020. Atomnas: Fine-grained end-to-end neural architecture search. In Proceedings of the International Conference on Learning Representations.
[221]
Hector Mendoza, Aaron Klein, Matthias Feurer, Jost Tobias Springenberg, and Frank Hutter. 2016. Towards automatically tuned neural networks. In Proceedings of the Workshop on Automatic Machine Learning.
[222]
Rang Meng, Weijie Chen, Di Xie, Yuan Zhang, and Shiliang Pu. 2020. Neural inheritance relation guided one-shot layer assignment search. Retrieved from https://arXiv:2002.12580.
[223]
Stephen Merity, Caiming Xiong, James Bradbury, and Richard Socher. 2016. Pointer sentinel mixture models. Retrieved from https://arXiv:1609.07843.
[224]
Risto Miikkulainen, Jason Liang, Elliot Meyerson, Aditya Rawal, Daniel Fink, Olivier Francon, Bala Raju, Hormoz Shahrzad, Arshak Navruzyan, Nigel Duffy et al. 2019. Evolving deep neural networks. In Proceedings of the Conference on Artificial Intelligence in the Age of Neural Networks and Brain Computing. 293–312.
[225]
Aliasghar Mortazi and Ulas Bagci. 2018. Automatically designing CNN architectures for medical image segmentation. In Proceedings of the International Workshop on Machine Learning in Medical Imaging.
[226]
Marcin Mozejko, Tomasz Latkowski, Lukasz Treszczotko, Michal Szafraniuk, and Krzysztof Trojanowski. 2020. Superkernel neural architecture search for image denoising. In Proceedings of the Conference on Computer Vision and Pattern Recognition Workshops.
[227]
Norman Mu, Zhewei Yao, Amir Gholami, Kurt Keutzer, and Michael Mahoney. 2018. Parameter re-initialization through cyclical batch size schedules. Retrieved from https://arXiv:1812.01216.
[228]
Vinod Nair and Geoffrey E. Hinton. 2010. Rectified linear units improve restricted boltzmann machines. In Proceedings of the International Conference on Machine Learning.
[229]
Niv Nayman, Asaf Noy, Tal Ridnik, Itamar Friedman, Rong Jin, and Lihi Zelnik. 2019. Xnas: Neural architecture search with expert advice. In Advances in Neural Information Processing Systems. MIT Press.
[230]
Renato Negrinho and Geoff Gordon. 2017. Deeparchitect: Automatically designing and training deep architectures. Retrieved from https://arXiv:1704.08792.
[231]
Renato Negrinho, Matthew Gormley, Geoffrey J. Gordon, Darshan Patil, Nghia Le, and Daniel Ferreira. 2019. Towards modular and programmable architecture search. In Advances in Neural Information Processing Systems. MIT Press.
[232]
Vladimir Nekrasov, Hao Chen, Chunhua Shen, and Ian Reid. 2019. Fast neural architecture search of compact semantic segmentation models via auxiliary cells. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[233]
Vladimir Nekrasov, Hao Chen, Chunhua Shen, and Ian Reid. 2020. Architecture search of dynamic cells for semantic video segmentation. In Proceedings of the Winter Conference on Applications of Computer Vision.
[234]
Vladimir Nekrasov, Chunhua Shen, and Ian Reid. 2020. Template-based automatic search of compact semantic segmentation architectures. In Proceedings of the Winter Conference on Applications of Computer Vision.
[235]
Kien Nguyen, Clinton Fookes, and Sridha Sridharan. 2020. Constrained design of deep iris networks. IEEE Trans. Image Process. 29 (2020), 7166–7175.
[236]
Xuefei Ning, Yin Zheng, Tianchen Zhao, Yu Wang, and Huazhong Yang. 2020. A generic graph-based neural architecture encoding scheme for predictor-based NAS. In Proceedings of the European Conference on Computer Vision.
[237]
Asaf Noy, Niv Nayman, Tal Ridnik, Nadav Zamir, Sivan Doveh, Itamar Friedman, Raja Giryes, and Lihi Zelnik. 2020. Asap: Architecture search, anneal and prune. In Proceedings of the International Conference on Artificial Intelligence and Statistics.
[238]
Alexander Ororbia, AbdElRahman ElSaid, and Travis Desell. 2019. Investigating recurrent neural network memory structures using neuro-evolution. In Proceedings of the Genetic and Evolutionary Computation Conference.
[239]
T. Den Ottelander, Arkadiy Dushatskiy, Marco Virgolin, and Peter A. N. Bosman. 2020. Local search is a remarkably strong baseline for neural architecture search. Retrieved from https://arXiv:2004.08996.
[240]
Rameswar Panda, Michele Merler, Mayoore Jaiswal, Hui Wu, Kandan Ramakrishnan, Ulrich Finkler, Chun-Fu Chen, Minsik Cho, David Kung, Rogerio Feris et al. 2020. NASTransfer: Analyzing architecture transferability in large scale neural architecture search. Retrieved from https://arXiv:2006.13314.
[241]
Ramakanth Pasunuru and Mohit Bansal. 2019. Continual and multi-task architecture search. In Proceedings of the Annual Meeting of the Association for Computational Linguistics.
[242]
Wei Peng, Xiaopeng Hong, and Guoying Zhao. 2019. Video action recognition via neural architecture searching. In Proceedings of the International Conference on Image Processing.
[243]
Juan-Manuel Pérez-Rúa, Moez Baccouche, and Stephane Pateux. 2018. Efficient progressive neural architecture search. Retrieved from https://arXiv:1808.00391.
[244]
Juan-Manuel Pérez-Rúa, Valentin Vielzeuf, Stéphane Pateux, Moez Baccouche, and Frédéric Jurie. 2019. Mfas: Multimodal fusion architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[245]
Hieu Pham, Melody Y. Guan, Barret Zoph, Quoc V. Le, and Jeff Dean. 2018. Efficient neural architecture search via parameter sharing. In Proceedings of the International Conference on Machine Learning.
[246]
A. J. Piergiovanni, Anelia Angelova, Alexander Toshev, and Michael S. Ryoo. 2019. Evolving space-time neural architectures for videos. In Proceedings of the International Conference on Computer Vision.
[247]
Jonas Prellberg and Oliver Kramer. 2018. Lamarckian evolution of convolutional neural networks. In Proceedings of the International Conference on Parallel Problem Solving from Nature.
[248]
Zhaofan Qiu, Ting Yao, Yiheng Zhang, Yongdong Zhang, and Tao Mei. 2019. Scheduled differentiable architecture search for visual recognition. Retrieved from https://arXiv:1909.10236.
[249]
Ruijie Quan, Xuanyi Dong, Yu Wu, Linchao Zhu, and Yi Yang. 2019. Auto-reid: Searching for a part-aware convnet for person re-identification. In Proceedings of the International Conference on Computer Vision.
[250]
Ilija Radosavovic, Justin Johnson, Saining Xie, Wan-Yen Lo, and Piotr Dollár. 2019. On network design spaces for visual recognition. In Proceedings of the International Conference on Computer Vision.
[251]
Ilija Radosavovic, Raj Prateek Kosaraju, Ross Girshick, Kaiming He, and Piotr Dollár. 2020. Designing network design spaces. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[252]
Prajit Ramachandran, Barret Zoph, and Quoc V. Le. 2017. Searching for activation functions. Retrieved from https://arXiv:1710.05941.
[253]
Mohammad Rastegari, Vicente Ordonez, Joseph Redmon, and Ali Farhadi. 2016. Xnor-net: Imagenet classification using binary convolutional neural networks. In Proceedings of the European Conference on Computer Vision.
[254]
Aditya Rawal and Risto Miikkulainen. 2018. From nodes to networks: Evolving recurrent neural networks. Retrieved from https://arXiv:1803.04439.
[255]
Esteban Real, Alok Aggarwal, Yanping Huang, and Quoc V. Le. 2019. Regularized evolution for image classifier architecture search. In Proceedings of the AAAI Conference on Artificial Intelligence.
[256]
Esteban Real, Chen Liang, David R. So, and Quoc V. Le. 2020. AutoML-Zero: Evolving machine learning algorithms from scratch. Retrieved from https://arXiv:2003.03384.
[257]
Esteban Real, Sherry Moore, Andrew Selle, Saurabh Saxena, Yutaka Leon Suematsu, Jie Tan, Quoc V. Le, and Alexey Kurakin. 2017. Large-scale evolution of image classifiers. In Proceedings of the International Conference on Machine Learning.
[258]
Joseph Redmon, Santosh Divvala, Ross Girshick, and Ali Farhadi. 2016. You only look once: Unified, real-time object detection. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[259]
Pengzhen Ren, Yun Xiao, Xiaojun Chang, Po-Yao Huang, Zhihui Li, Xiaojiang Chen, and Xin Wang. 2020. A comprehensive survey of neural architecture search: Challenges and solutions. Retrieved from https://arXiv:2006.02903.
[260]
Shaoqing Ren, Kaiming He, Ross Girshick, and Jian Sun. 2015. Faster r-cnn: Towards real-time object detection with region proposal networks. In Advances in Neural Information Processing Systems. MIT Press.
[261]
J. Gomez Robles and Joaquin Vanschoren. 2019. Learning to reinforcement learn for neural architecture search. Retrieved from https://arXiv:1911.03769.
[262]
Raanan Y. Rohekar, Shami Nisimov, Yaniv Gurwicz, Guy Koren, and Gal Novik. 2018. Constructing deep neural networks by bayesian network structure learning. In Advances in Neural Information Processing Systems. MIT Press.
[263]
Olga Russakovsky, Jia Deng, Hao Su, Jonathan Krause, Sanjeev Satheesh, Sean Ma, Zhiheng Huang, Andrej Karpathy, Aditya Khosla, Michael Bernstein et al. 2015. Imagenet large scale visual recognition challenge. Int. J. Comput. Vision 115, 3 (2015), 211–252.
[264]
Michael S. Ryoo, A. J. Piergiovanni, Mingxing Tan, and Anelia Angelova. 2020. Assemblenet: Searching for multi-stream neural connectivity in video architectures. In Proceedings of the International Conference on Learning Representations.
[265]
Tonmoy Saikia, Yassine Marrakchi, Arber Zela, Frank Hutter, and Thomas Brox. 2019. Autodispnet: Improving disparity estimation with automl. In Proceedings of the International Conference on Computer Vision.
[266]
Cristiano Saltori, Subhankar Roy, Nicu Sebe, and Giovanni Iacca. 2019. Regularized evolutionary algorithm for dynamic neural topology search. In Proceedings of the International Conference on Image Analysis and Processing.
[267]
Pedro Savarese and Michael Maire. 2019. Learning implicitly recurrent CNNs through parameter sharing. Retrieved from https://arXiv:1902.09701.
[268]
Shreyas Saxena and Jakob Verbeek. 2016. Convolutional neural fabrics. In Advances in Neural Information Processing Systems. MIT Press.
[269]
Tom Schaul and Jürgen Schmidhuber. 2010. Metalearning. Scholarpedia 5, 6 (2010), 4650.
[270]
Jürgen Schmidhuber. 1987. Evolutionary Principles in Self-referential Learning, or on Learning How to Learn: The Meta-meta-... Hook. Ph.D. Dissertation. Technische Universität München.
[271]
Martin Schrimpf, Stephen Merity, James Bradbury, and Richard Socher. 2017. A flexible approach to automated rnn architecture generation. Retrieved from https://arXiv:1712.07316.
[272]
Christian Sciuto, Kaicheng Yu, Martin Jaggi, Claudiu Musat, and Mathieu Salzmann. 2019. Evaluating the search phase of neural architecture search. Retrieved from https://arXiv:1902.08142.
[273]
Albert Shaw, Daniel Hunter, Forrest Iandola, and Sammy Sidhu. 2019. Squeezenas: Fast neural architecture search for faster semantic segmentation. In Proceedings of the International Conference on Computer Vision Workshops.
[274]
Albert Shaw, Wei Wei, Weiyang Liu, Le Song, and Bo Dai. 2019. Meta architecture search. In Advances in Neural Information Processing Systems. MIT Press.
[275]
Mingzhu Shen, Kai Han, Chunjing Xu, and Yunhe Wang. 2019. Searching for accurate binary neural architectures. In Proceedings of the International Conference on Computer Vision Workshops.
[276]
Han Shi, Renjie Pi, Hang Xu, Zhenguo Li, James T. Kwok, and Tong Zhang. 2019. Multi-objective neural architecture search via predictive network performance optimization. Retrieved from https://arXiv:1911.09336.
[277]
Richard Shin, Charles Packer, and Dawn Song. 2018. Differentiable neural network architecture search. In Proceedings of the International Conference on Learning Representations: Workshops.
[278]
Connor Shorten and Taghi M. Khoshgoftaar. 2019. A survey on image data augmentation for deep learning. J. Big Data 6, 1 (2019), 60.
[279]
Karen Simonyan and Andrew Zisserman. 2015. Very deep convolutional networks for large-scale image recognition. In Proceedings of the International Conference on Learning Representations.
[280]
Sean C. Smithson, Guang Yang, Warren J. Gross, and Brett H. Meyer. 2016. Neural networks designing neural networks: Multi-objective hyper-parameter optimization. In Proceedings of the International Conference on Computer-Aided Design.
[281]
Jasper Snoek, Hugo Larochelle, and Ryan P. Adams. 2012. Practical bayesian optimization of machine learning algorithms. In Advances in Neural Information Processing Systems. MIT Press, 2951–2959.
[282]
David So, Quoc Le, and Chen Liang. 2019. The evolved transformer. In Proceedings of the International Conference on Machine Learning.
[283]
Dehua Song, Chang Xu, Xu Jia, Yiyi Chen, Chunjing Xu, and Yunhe Wang. 2020. Efficient residual dense block search for image super-resolution. In Proceedings of the AAAI Conference on Artificial Intelligence.
[284]
Qingquan Song, Dehua Cheng, Hanning Zhou, Jiyan Yang, Yuandong Tian, and Xia Hu. 2020. Towards automated neural interaction discovery for click-through rate prediction. Retrieved from https://arXiv:2007.06434.
[285]
Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov. 2014. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 15, 1 (2014), 1929–1958.
[286]
Dimitrios Stamoulis, Ruizhou Ding, Di Wang, Dimitrios Lymberopoulos, Bodhi Priyantha, Jie Liu, and Diana Marculescu. 2019. Single-path nas: Designing hardware-efficient convnets in less than 4 hours. In Proceedings of the Joint European Conference on Machine Learning and Knowledge Discovery in Databases.
[287]
Kenneth O. Stanley, Jeff Clune, Joel Lehman, and Risto Miikkulainen. 2019. Designing neural networks through neuroevolution. Nature Mach. Intell. 1, 1 (2019), 24–35.
[288]
Kenneth O. Stanley and Risto Miikkulainen. 2002. Evolving neural networks through augmenting topologies. Evolution. Comput. 10, 2 (2002), 99–127.
[289]
Felipe Petroski Such, Aditya Rawal, Joel Lehman, Kenneth O. Stanley, and Jeff Clune. 2019. Generative teaching networks: Accelerating neural architecture search by learning to generate synthetic training data. Retrieved from https://arXiv:1912.07768.
[290]
Masanori Suganuma, Shinichi Shirakawa, and Tomoharu Nagao. 2017. A genetic programming approach to designing convolutional neural network architectures. In Proceedings of the Genetic and Evolutionary Computation Conference.
[291]
Ke Sun, Bin Xiao, Dong Liu, and Jingdong Wang. 2019. Deep high-resolution representation learning for human pose estimation. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[292]
Li Sun, Xiaoyi Yu, Liuan Wang, Jun Sun, Hiroya Inakoshi, Ken Kobayashi, and Hiromichi Kobashi. 2019. Automatic neural network search method for open set recognition. In Proceedings of the International Conference on Image Processing.
[293]
Yanan Sun, Handing Wang, Bing Xue, Yaochu Jin, Gary G. Yen, and Mengjie Zhang. 2019. Surrogate-assisted evolutionary deep learning using an end-to-end random forest-based performance predictor. IEEE Trans. Evolution. Comput. 24, 2 (2019), 350–364.
[294]
Yanan Sun, Bing Xue, Mengjie Zhang, and Gary G. Yen. 2018. Automatically evolving CNN architectures based on blocks. Retrieved from https://arXiv:1810.11875.
[295]
Yanan Sun, Bing Xue, Mengjie Zhang, and Gary G. Yen. 2018. A particle swarm optimization-based flexible convolutional autoencoder for image classification. IEEE Trans. Neural Netw. Learn. Syst. 30, 8 (2018), 2295–2309.
[296]
Yanan Sun, Bing Xue, Mengjie Zhang, and Gary G. Yen. 2020. Evolving deep convolutional neural networks for image classification. IEEE Trans. Evolution. Comput. 24, 2 (2020), 394–407.
[297]
Yanan Sun, Gary G. Yen, and Zhang Yi. 2018. Evolving unsupervised deep neural networks for learning meaningful representations. IEEE Trans. Evolution. Comput. 23, 1 (2018), 89–103.
[298]
Ilya Sutskever, James Martens, George Dahl, and Geoffrey Hinton. 2013. On the importance of initialization and momentum in deep learning. In Proceedings of the International Conference on Machine Learning.
[299]
Christian Szegedy, Wei Liu, Yangqing Jia, Pierre Sermanet, Scott Reed, Dragomir Anguelov, Dumitru Erhan, Vincent Vanhoucke, and Andrew Rabinovich. 2015. Going deeper with convolutions. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[300]
El-Ghazali Talbi. 2020. Optimization of deep neural networks: A survey and unified taxonomy. Technical Report. University of Lille and INRIA.
[301]
Mingxing Tan, Bo Chen, Ruoming Pang, Vijay Vasudevan, Mark Sandler, Andrew Howard, and Quoc V. Le. 2019. Mnasnet: Platform-aware neural architecture search for mobile. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[302]
Mingxing Tan and Quoc V. Le. 2019. EfficientNet: Rethinking model scaling for convolutional neural networks. In Proceedings of the International Conference on Machine Learning.
[303]
Mingxing Tan and Quoc V. Le. 2019. Mixconv: Mixed depthwise convolutional kernels. In Proceedings of the British Machine Vision Conference.
[304]
Mingxing Tan, Ruoming Pang, and Quoc V. Le. 2020. Efficientdet: Scalable and efficient object detection. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[305]
Chris Thornton, Frank Hutter, Holger H. Hoos, and Kevin Leyton-Brown. 2013. Auto-WEKA: Combined selection and hyperparameter optimization of classification algorithms. In Proceedings of the International Conference on Knowledge Discovery and Data Mining (SIGKDD’13).
[306]
Yunjie Tian, Chang Liu, Lingxi Xie, Jianbin Jiao, and Qixiang Ye. 2020. Discretization-aware architecture search. Retrieved from https://arXiv:2007.03154.
[307]
Yuesong Tian, Li Shen, Guinan Su, Zhifeng Li, and Wei Liu. 2020. AlphaGAN: Fully differentiable architecture search for generative adversarial networks. Retrieved from https://arXiv:2006.09134.
[308]
Yuan Tian, Qin Wang, Zhiwu Huang, Wen Li, Dengxin Dai, Minghao Yang, Jun Wang, and Olga Fink. 2020. Off-policy reinforcement learning for efficient and effective GAN architecture search. Retrieved from https://arXiv:2007.09180.
[309]
Alexander Toshev and Christian Szegedy. 2014. Deeppose: Human pose estimation via deep neural networks. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[310]
Ilya Trofimov, Nikita Klyuchnikov, Mikhail Salnikov, Alexander Filippov, and Evgeny Burnaev. 2020. Multi-fidelity neural architecture search with knowledge distillation. Retrieved from https://arXiv:2006.08341.
[311]
Gerard Jacques van Wyk and Anna Sergeevna Bosman. 2019. Evolutionary neural architecture search for image restoration. In Proceedings of the International Joint Conference on Neural Networks.
[312]
Joaquin Vanschoren. 2018. Meta-learning: A survey. Retrieved from https://arXiv:1810.03548.
[313]
Danilo Vasconcellos Vargas and Shashank Kotyan. 2019. Evolving robust neural architectures to defend from adversarial attacks. Retrieved from https://arXiv:1906.11667.
[314]
Tom Véniat, Olivier Schwander, and Ludovic Denoyer. 2019. Stochastic adaptive neural architecture search for keyword spotting. In Proceedings of the International Conference on Acoustics, Speech and Signal Processing.
[315]
Oriol Vinyals, Charles Blundell, Timothy Lillicrap, Daan Wierstra et al. 2016. Matching networks for one shot learning. In Advances in Neural Information Processing Systems. MIT Press.
[316]
Bin Wang, Yanan Sun, Bing Xue, and Mengjie Zhang. 2018. Evolving deep convolutional neural networks by variable-length particle swarm optimization for image classification. In Proceedings of the IEEE Congress on Evolutionary Computation.
[317]
Bin Wang, Yanan Sun, Bing Xue, and Mengjie Zhang. 2018. A hybrid differential evolution approach to designing deep convolutional neural networks for image classification. In Proceedings of the Australasian Joint Conference on Artificial Intelligence.
[318]
Bin Wang, Yanan Sun, Bing Xue, and Mengjie Zhang. 2019. A hybrid GA-PSO method for evolving architecture and short connections of deep convolutional neural networks. In Proceedings of the Pacific Rim International Conference on Artificial Intelligence.
[319]
Bin Wang, Bing Xue, and Mengjie Zhang. 2019. Particle swarm optimisation for evolving deep neural networks for image classification by evolving and stacking transferable blocks. Retrieved from https://arXiv:1907.12659.
[320]
Hanchao Wang and Jun Huan. 2019. Agan: Towards automated design of generative adversarial networks. Retrieved from https://arXiv:1906.11080.
[321]
Jingdong Wang, Ke Sun, Tianheng Cheng, Borui Jiang, Chaorui Deng, Yang Zhao, Dong Liu, Yadong Mu, Mingkui Tan, Xinggang Wang et al. 2020. Deep high-resolution representation learning for visual recognition. IEEE Trans. Pattern Anal. Mach. Intell. (2020).
[322]
Jingdong Wang, Zhen Wei, Ting Zhang, and Wenjun Zeng. 2016. Deeply fused nets. Retrieved from https://arXiv:1605.07716.
[323]
Lanfei Wang, Lingxi Xie, Tianyi Zhang, Jun Guo, and Qi Tian. 2019. Scalable NAS with factorizable architectural parameters. Retrieved from https://arXiv:1912.13256.
[324]
Linnan Wang, Saining Xie, Teng Li, Rodrigo Fonseca, and Yuandong Tian. 2019. Sample-efficient neural architecture search by learning action space. Retrieved from https://arXiv:1906.06832.
[325]
Linnan Wang, Yiyang Zhao, Yuu Jinnai, Yuandong Tian, and Rodrigo Fonseca. 2019. Alphax: Exploring neural architectures with deep neural networks and monte carlo tree search. Retrieved from https://arXiv:1903.11059.
[326]
Yujing Wang, Yaming Yang, Yiren Chen, Jing Bai, Ce Zhang, Guinan Su, Xiaoyu Kou, Yunhai Tong, Mao Yang, and Lidong Zhou. 2020. TextNAS: A neural architecture search space tailored for text representation. In Proceedings of the AAAI Conference on Artificial Intelligence.
[327]
Longhui Wei, An Xiao, Lingxi Xie, Xin Chen, Xiaopeng Zhang, and Qi Tian. 2020. Circumventing outliers of autoaugment with knowledge distillation. In Proceedings of the European Conference on Computer Vision.
[328]
Tao Wei, Changhu Wang, and Chang Wen Chen. 2017. Modularized morphing of neural networks. Retrieved from https://arXiv:1701.03281.
[329]
Tao Wei, Changhu Wang, Yong Rui, and Chang Wen Chen. 2016. Network morphism. In Proceedings of the International Conference on Machine Learning.
[330]
Yu Weng, Tianbao Zhou, Yujie Li, and Xiaoyu Qiu. 2019. Nas-unet: Neural architecture search for medical image segmentation. IEEE Access 7 (2019), 44247–44257.
[331]
Yu Weng, Tianbao Zhou, Lei Liu, and Chunlei Xia. 2019. Automatic convolutional neural architecture search for image classification under different scenes. IEEE Access 7 (2019), 38495–38506.
[332]
Colin White, Willie Neiswanger, Sam Nolen, and Yash Savani. 2020. A study on encodings for neural architecture search. Retrieved from https://arXiv:2007.04965.
[333]
Colin White, Willie Neiswanger, and Yash Savani. 2019. Bananas: Bayesian optimization with neural architectures for neural architecture search. Retrieved from https://arXiv:1910.11858.
[334]
Martin Wistuba. 2017. Finding competitive network architectures within a day using UCT. Retrieved from https://arXiv:1712.07420.
[335]
Martin Wistuba. 2018. Deep learning architecture search by neuro-cell-based evolution with function-preserving mutations. In Proceedings of the Joint European Conference on Machine Learning and Knowledge Discovery in Databases.
[336]
Martin Wistuba and Tejaswini Pedapati. 2019. Inductive transfer for neural architecture optimization. Retrieved from https://arXiv:1903.03536.
[337]
Martin Wistuba, Ambrish Rawat, and Tejaswini Pedapati. 2019. A survey on neural architecture search. Retrieved from https://arXiv:1905.01392.
[338]
Alexander Wong, Zhong Qiu Lin, and Brendan Chwyl. 2019. Attonets: Compact and efficient deep neural networks for the edge via human-machine collaborative design. In Proceedings of the Conference on Computer Vision and Pattern Recognition Workshops.
[339]
Catherine Wong, Neil Houlsby, Yifeng Lu, and Andrea Gesmundo. 2018. Transfer learning with neural automl. In Advances in Neural Information Processing Systems. MIT Press.
[340]
Ken C. L. Wong and Mehdi Moradi. 2019. SegNAS3D: Network architecture search with derivative-free global optimization for 3D image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention.
[341]
Bichen Wu, Xiaoliang Dai, Peizhao Zhang, Yanghan Wang, Fei Sun, Yiming Wu, Yuandong Tian, Peter Vajda, Yangqing Jia, and Kurt Keutzer. 2019. Fbnet: Hardware-aware efficient convnet design via differentiable neural architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[342]
Bichen Wu, Yanghan Wang, Peizhao Zhang, Yuandong Tian, Peter Vajda, and Kurt Keutzer. 2018. Mixed precision quantization of convnets via differentiable neural architecture search. Retrieved from https://arXiv:1812.00090.
[343]
Huikai Wu, Junge Zhang, and Kaiqi Huang. 2019. SparseMask: Differentiable connectivity learning for dense image prediction. In Proceedings of the International Conference on Computer Vision.
[344]
Zuxuan Wu, Tushar Nagarajan, Abhishek Kumar, Steven Rennie, Larry S. Davis, Kristen Grauman, and Rogerio Feris. 2018. Blockdrop: Dynamic inference paths in residual networks. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[345]
Xia Xiao, Zigeng Wang, and Sanguthevar Rajasekaran. 2019. Autoprune: Automatic network pruning by regularizing auxiliary parameters. In Advances in Neural Information Processing Systems. MIT Press.
[346]
Cihang Xie, Mingxing Tan, Boqing Gong, Jiang Wang, Alan L. Yuille, and Quoc V. Le. 2020. Adversarial examples improve image recognition. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[347]
Lingxi Xie and Alan Yuille. 2017. Genetic cnn. In Proceedings of the International Conference on Computer Vision.
[348]
Saining Xie, Ross Girshick, Piotr Dollár, Zhuowen Tu, and Kaiming He. 2017. Aggregated residual transformations for deep neural networks. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[349]
Saining Xie, Alexander Kirillov, Ross Girshick, and Kaiming He. 2019. Exploring randomly wired neural networks for image recognition. In Proceedings of the International Conference on Computer Vision.
[350]
Saining Xie and Zhuowen Tu. 2015. Holistically nested edge detection. In Proceedings of the International Conference on Computer Vision.
[351]
Sirui Xie, Hehui Zheng, Chunxiao Liu, and Liang Lin. 2019. SNAS: Stochastic neural architecture search. In Proceedings of the International Conference on Learning Representations.
[352]
Xukai Xie, Yuan Zhou, and Sun-Yuan Kung. 2019. Exploiting operation importance for differentiable neural architecture search. Retrieved from https://arXiv:1911.10511.
[353]
Yunyang Xiong, Hanxiao Liu, Suyog Gupta, Berkin Akin, Gabriel Bender, Pieter-Jan Kindermans, Mingxing Tan, Vikas Singh, and Bo Chen. 2020. MobileDets: Searching for object detection architectures for mobile accelerators. Retrieved from https://arXiv:2004.14525.
[354]
Yunyang Xiong, Ronak Mehta, and Vikas Singh. 2019. Resource constrained neural network architecture search: Will a submodularity assumption help? In Proceedings of the International Conference on Computer Vision.
[355]
Hang Xu, Lewei Yao, Wei Zhang, Xiaodan Liang, and Zhenguo Li. 2019. Auto-fpn: Automatic network architecture adaptation for object detection beyond classification. In Proceedings of the International Conference on Computer Vision.
[356]
Yixing Xu, Yunhe Wang, Kai Han, Hanting Chen, Yehui Tang, Shangling Jui, Chunjing Xu, Qi Tian, and Chang Xu. 2019. Renas: Architecture ranking for powerful networks. Retrieved from https://arXiv:1910.01523.
[357]
Yixing Xu, Yunhe Wang, Kai Han, Shangling Jui, Chunjing Xu, Qi Tian, and Chang Xu. 2020. ReNAS: Relativistic evaluation of neural architecture search. In Proceedings of the AAAI Conference on Artificial Intelligence.
[358]
Yuhui Xu, Lingxi Xie, Xiaopeng Zhang, Xin Chen, Guo-Jun Qi, Qi Tian, and Hongkai Xiong. 2020. Pc-darts: Partial channel connections for memory-efficient architecture search. In Proceedings of the International Conference on Learning Representations.
[359]
Yuhui Xu, Lingxi Xie, Xiaopeng Zhang, Xin Chen, Bowen Shi, Qi Tian, and Hongkai Xiong. 2020. Latency-aware differentiable neural architecture search. Retrieved from https://arXiv:2001.06392.
[360]
Shen Yan, Biyi Fang, Faen Zhang, Yu Zheng, Xiao Zeng, Mi Zhang, and Hui Xu. 2019. Hm-nas: Efficient neural architecture search via hierarchical masking. In Proceedings of the International Conference on Computer Vision Workshops.
[361]
Xingang Yan, Weiwen Jiang, Yiyu Shi, and Cheng Zhuo. 2020. MS-NAS: Multi-scale neural architecture search for medical image segmentation. Retrieved from https://arXiv:2007.06151.
[362]
Antoine Yang, Pedro M. Esperança, and Fabio M. Carlucci. 2020. NAS evaluation is frustratingly hard. In Proceedings of the International Conference on Learning Representations.
[363]
Sen Yang, Wankou Yang, and Zhen Cui. 2019. Pose neural fabrics search. Retrieved from https://arXiv:1909.07068.
[364]
Zhaohui Yang, Yunhe Wang, Xinghao Chen, Boxin Shi, Chao Xu, Chunjing Xu, Qi Tian, and Chang Xu. 2020. Cars: Continuous evolution for efficient neural architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[365]
Lewei Yao, Hang Xu, Wei Zhang, Xiaodan Liang, and Zhenguo Li. 2020. SM-NAS: Structural-to-modular neural architecture search for object detection. In Proceedings of the AAAI Conference on Artificial Intelligence.
[366]
Quanming Yao, Ju Xu, Wei-Wei Tu, and Zhanxing Zhu. 2020. Efficient neural architecture search via proximal iterations. In Proceedings of the AAAI Conference on Artificial Intelligence.
[367]
Xin Yao. 1999. Evolving artificial neural networks. Proc. IEEE 87, 9 (1999), 1423–1447.
[368]
Chris Ying, Aaron Klein, Eric Christiansen, Esteban Real, Kevin Murphy, and Frank Hutter. 2019. Nas-bench-101: Towards reproducible neural architecture search. In Proceedings of the International Conference on Machine Learning.
[369]
Hongyuan Yu and Houwen Peng. 2020. Cyclic differentiable architecture search. Retrieved from https://arXiv:2006.10724.
[370]
Jiahui Yu and Thomas Huang. 2019. Network slimming by slimmable networks: Towards one-shot architecture search for channel numbers. Retrieved from https://arXiv:1903.11728.
[371]
Jiahui Yu, Pengchong Jin, Hanxiao Liu, Gabriel Bender, Pieter-Jan Kindermans, Mingxing Tan, Thomas Huang, Xiaodan Song, Ruoming Pang, and Quoc Le. 2020. Bignas: Scaling up neural architecture search with big single-stage models. In Proceedings of the European Conference on Computer Vision.
[372]
Kaicheng Yu, Christian Sciuto, Martin Jaggi, Claudiu Musat, and Mathieu Salzmann. 2020. Evaluating the search phase of neural architecture search. In Proceedings of the International Conference on Learning Representations.
[373]
Qihang Yu, Dong Yang, Holger Roth, Yutong Bai, Yixiao Zhang, Alan L. Yuille, and Daguang Xu. 2020. C2FNAS: Coarse-to-fine neural architecture search for 3D medical image segmentation. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[374]
Tong Yu and Hong Zhu. 2020. Hyper-parameter optimization: A review of algorithms and applications. Retrieved from https://arXiv:2003.05689.
[375]
Ye Yu, Yingmin Li, Shuai Che, Niraj K. Jha, and Weifeng Zhang. 2020. Software-defined design space exploration for an efficient DNN accelerator architecture. IEEE Trans. Comput. 70, 1 (2020), 45–56.
[376]
Zhihang Yuan, Bingzhe Wu, Zheng Liang, Shiwan Zhao, Weichen Bi, and Guangyu Sun. 2019. S2DNAS: Transforming static CNN model for dynamic inference via neural architecture search. Retrieved from https://arXiv:1911.07033.
[377]
Arber Zela, Thomas Elsken, Tonmoy Saikia, Yassine Marrakchi, Thomas Brox, and Frank Hutter. 2020. Understanding and robustifying differentiable architecture search. In Proceedings of the International Conference on Learning Representations.
[378]
Arber Zela, Aaron Klein, Stefan Falkner, and Frank Hutter. 2018. Towards automated deep learning: Efficient joint neural architecture and hyperparameter search. Retrieved from https://arXiv:1807.06906.
[379]
Arber Zela, Julien Siems, and Frank Hutter. 2020. Nas-bench-1shot1: Benchmarking and dissecting one-shot neural architecture search. In Proceedings of the International Conference on Learning Representations.
[380]
Chris Zhang, Mengye Ren, and Raquel Urtasun. 2019. Graph hypernetworks for neural architecture search. In Proceedings of the International Conference on Learning Representations.
[381]
Haokui Zhang, Ying Li, Hao Chen, and Chunhua Shen. 2020. Memory-efficient hierarchical neural architecture search for image denoising. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[382]
Hang Zhang, Chongruo Wu, Zhongyue Zhang, Yi Zhu, Zhi Zhang, Haibin Lin, Yue Sun, Tong He, Jonas Mueller, R. Manmatha et al. 2020. Resnest: Split-attention networks. Retrieved from https://arXiv:2004.08955.
[383]
Hui Zhang, Quanming Yao, Mingkun Yang, Yongchao Xu, and Xiang Bai. 2020. Efficient backbone search for scene text recognition. Retrieved from https://arXiv:2003.06567.
[384]
Miao Zhang, Huiqi Li, Shirui Pan, Taoping Liu, and Steven Su. 2019. Efficient novelty-driven neural architecture search. Retrieved from https://arXiv:1907.09109.
[385]
Shizhou Zhang, Rui Cao, Xing Wei, Peng Wang, and Yanning Zhang. 2019. Person re-identification with neural architecture search. In Proceedings of the Conference on Pattern Recognition and Computer Vision.
[386]
Shuang Zhang, Liyao Xiang, Congcong Li, Yixuan Wang, Zeyu Liu, Quanshi Zhang, and Bo Li. 2019. Preventing information leakage with neural architecture search. Retrieved from https://arXiv:1912.08421.
[387]
Tunhou Zhang, Hsin-Pai Cheng, Zhenwen Li, Feng Yan, Chengyu Huang, Hai Helen Li, and Yiran Chen. 2020. AutoShrink: A topology-aware NAS for discovering efficient neural architecture. In Proceedings of the AAAI Conference on Artificial Intelligence.
[388]
Wei Zhang, Lin Zhao, Qing Li, Shijie Zhao, Qinglin Dong, Xi Jiang, Tuo Zhang, and Tianming Liu. 2019. Identify hierarchical structures from task-based fMRI data via hybrid spatiotemporal neural architecture search net. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention.
[389]
Xinbang Zhang, Zehao Huang, and Naiyan Wang. 2018. You only search once: Single shot neural architecture search via direct sparse optimization. Retrieved from https://arXiv:1811.01567.
[390]
Xinyu Zhang, Qiang Wang, Jian Zhang, and Zhao Zhong. 2020. Adversarial autoaugment. In Proceedings of the International Conference on Learning Representations.
[391]
Xiong Zhang, Hongmin Xu, Hong Mo, Jianchao Tan, Cheng Yang, and Wenqi Ren. 2020. DCNAS: Densely connected neural architecture search for semantic image segmentation. Retrieved from https://arXiv:2003.11883.
[392]
Yuge Zhang, Zejun Lin, Junyang Jiang, Quanlu Zhang, Yujing Wang, Hui Xue, Chen Zhang, and Yaming Yang. 2020. Deeper insights into weight sharing in neural architecture search. Retrieved from https://arXiv:2001.01431.
[393]
Yongqi Zhang, Quanming Yao, and Lei Chen. 2019. Neural recurrent structure search for knowledge graph embedding. Retrieved from https://arXiv:1911.07132.
[394]
Zijun Zhang, Linqi Zhou, Liangke Gou, and Ying Nian Wu. 2019. Neural architecture search for joint optimization of predictive power and biological knowledge. Retrieved from https://arXiv:1909.00337.
[395]
Liming Zhao, Mingjie Li, Depu Meng, Xi Li, Zhaoxiang Zhang, Yueting Zhuang, Zhuowen Tu, and Jingdong Wang. 2018. Deep convolutional neural networks with merge-and-run mappings. In Proceedings of the International Joint Conference on Artificial Intelligence.
[396]
Pengyu Zhao, Kecheng Xiao, Yuanxing Zhang, Kaigui Bian, and Wei Yan. 2020. AMER: Automatic behavior modeling and interaction exploration in recommender system. Retrieved from https://arXiv:2006.05933.
[397]
Ruizhe Zhao and Wayne Luk. 2019. Efficient structured pruning and architecture searching for group convolution. In Proceedings of the International Conference on Computer Vision Workshops.
[398]
Yiren Zhao, Duo Wang, Xitong Gao, Robert Mullins, Pietro Lio, and Mateja Jamnik. 2020. Probabilistic dual network architecture search on graphs. Retrieved from https://arXiv:2003.09676.
[399]
Xiawu Zheng, Rongrong Ji, Lang Tang, Yan Wan, Baochang Zhang, Yongjian Wu, Yunsheng Wu, and Ling Shao. 2019. Dynamic distribution pruning for efficient network architecture search. Retrieved from https://arXiv:1905.13543.
[400]
Xiawu Zheng, Rongrong Ji, Lang Tang, Baochang Zhang, Jianzhuang Liu, and Qi Tian. 2019. Multinomial distribution learning for effective neural architecture search. In Proceedings of the International Conference on Computer Vision.
[401]
Xiawu Zheng, Rongrong Ji, Qiang Wang, Qixiang Ye, Zhenguo Li, Yonghong Tian, and Qi Tian. 2020. Rethinking performance estimation in neural architecture search. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[402]
Zhao Zhong, Junjie Yan, Wei Wu, Jing Shao, and Cheng-Lin Liu. 2018. Practical block-wise neural network architecture generation. In Proceedings of the Conference on Computer Vision and Pattern Recognition.
[403]
Zhao Zhong, Zichen Yang, Boyang Deng, Junjie Yan, Wei Wu, Jing Shao, and Cheng-Lin Liu. 2021. BlockQNN: Efficient block-wise neural network architecture generation. IEEE Trans. Pattern Anal. Mach. Intell. 43, 7 (2021), 2314–2328.
[404]
Hongpeng Zhou, Minghao Yang, Jun Wang, and Wei Pan. 2019. BayesNAS: A bayesian approach for neural architecture search. In Proceedings of the International Conference on Machine Learning.
[405]
Kaixiong Zhou, Qingquan Song, Xiao Huang, and Xia Hu. 2019. Auto-gnn: Neural architecture search of graph neural networks. Retrieved from https://arXiv:1909.03184.
[406]
Peng Zhou, Lingxi Xie, Xiaopeng Zhang, Bingbing Ni, and Qi Tian. 2020. Searching towards class-aware generators for conditional generative adversarial networks. Retrieved from https://arXiv:2006.14208.
[407]
Pan Zhou, Caiming Xiong, Richard Socher, and Steven C. H. Hoi. 2020. Theory-inspired path-regularized differential network architecture search. Retrieved from https://arXiv:2006.16537.
[408]
Xin Zhou, Dejing Dou, and Boyang Li. 2019. Searching for stage-wise neural graphs in the limit. Retrieved from https://arXiv:1912.12860.
[409]
Yanqi Zhou, Siavash Ebrahimi, Sercan Ö. Arık, Haonan Yu, Hairong Liu, and Greg Diamos. 2018. Resource-efficient neural architect. Retrieved from https://arXiv:1806.07912.
[410]
Yizhou Zhou, Xiaoyan Sun, Chong Luo, Zheng-Jun Zha, and Wenjun Zeng. 2020. Posterior-guided neural architecture search. In Proceedings of the AAAI Conference on Artificial Intelligence.
[411]
Yanqi Zhou, Peng Wang, Sercan Arik, Haonan Yu, Syed Zawad, Feng Yan, and Greg Diamos. 2019. EPNAS: Efficient progressive neural architecture search. In Proceedings of the British Machine Vision Conference.
[412]
Hui Zhu, Zhulin An, Chuanguang Yang, Kaiqiang Xu, Erhu Zhao, and Yongjun Xu. 2019. EENA: Efficient evolution of neural architecture. In Proceedings of the International Conference on Computer Vision Workshops.
[413]
Zhuotun Zhu, Chenxi Liu, Dong Yang, Alan Yuille, and Daguang Xu. 2019. V-nas: Neural architecture search for volumetric medical image segmentation. In Proceedings of the International Conference on 3D Vision.
[414]
Barret Zoph and Quoc V. Le. 2017. Neural architecture search with reinforcement learning. In Proceedings of the International Conference on Learning Representations.
[415]
Barret Zoph, Vijay Vasudevan, Jonathon Shlens, and Quoc V. Le. 2018. Learning transferable architectures for scalable image recognition. In Proceedings of the Conference on Computer Vision and Pattern Recognition.

    Published In

    ACM Computing Surveys, Volume 54, Issue 9 (December 2022), 800 pages.
    ISSN: 0360-0300; EISSN: 1557-7341; DOI: 10.1145/3485140.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 08 October 2021
    Accepted: 01 June 2021
    Revised: 01 April 2021
    Received: 01 September 2020
    Published in CSUR Volume 54, Issue 9

    Author Tags

    1. AutoML
    2. neural architecture search
    3. weight-sharing
    4. super-network
    5. optimization gap
    6. computer vision

    Qualifiers

    • Survey
    • Refereed
