UniADS: Universal Architecture-Distiller Search for Distillation Gap
DOI: https://doi.org/10.1609/aaai.v38i13.29327
Keywords:
ML: Transfer, Domain Adaptation, Multi-Task Learning; CV: Applications; DMKM: Graph Mining, Social Network Analysis & Community; ML: Deep Learning Algorithms; ML: Deep Learning Theory; ML: Semi-Supervised Learning; ML: Transparent, Interpretable, Explainable ML; ML: Unsupervised & Self-Supervised Learning
Abstract
In this paper, we present UniADS, the first Universal Architecture-Distiller Search framework for co-optimizing the student architecture and the distillation policy. The teacher-student distillation gap limits distillation gains, yet previous approaches seek the ideal student architecture while ignoring the distillation settings. In UniADS, we construct a comprehensive search space encompassing the student architecture, the knowledge transformations used in the distillation strategy, distance functions, loss weights, and other vital settings. To explore this space efficiently, we use the NSGA-II genetic algorithm to evolve configurations through crossover and mutation, and the Successive Halving algorithm to prune the search space, improving search efficiency and yielding promising results. Extensive experiments are performed on different teacher-student pairs using the CIFAR-100 and ImageNet datasets, and the results consistently demonstrate the superiority of our method over existing approaches. Furthermore, we provide a detailed analysis of the search results, examining the impact of each variable and extracting valuable insights and practical guidance for distillation design and implementation.
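To make the search procedure described above concrete, the following is a minimal Python sketch of the general pattern: a configuration couples a student-architecture choice with distillation settings (knowledge transformation, distance function, loss weight, temperature), candidates are evolved by crossover and mutation, and Successive Halving discards weak candidates at small training budgets. Everything here is an illustrative assumption rather than the authors' implementation: the choice lists, the toy evaluate function, and the plain single-objective evolutionary loop (standing in for NSGA-II's multi-objective non-dominated sorting) are hypothetical.

```python
# Sketch of joint architecture-distiller search with Successive Halving pruning.
# Assumed/illustrative names: CHOICES, sample_config, evaluate. Not the UniADS code.
import random

# Joint search space: student architecture plus distillation policy.
CHOICES = {
    "student_depth":       [8, 14, 20, 32],
    "student_width":       [0.5, 0.75, 1.0],
    "knowledge_transform": ["logits", "features", "attention"],
    "distance":            ["kl", "l2", "cosine"],
    "kd_loss_weight":      [0.1, 0.5, 1.0, 2.0],
    "temperature":         [1, 2, 4, 8],
}

def sample_config():
    return {k: random.choice(v) for k, v in CHOICES.items()}

def crossover(a, b):
    # Uniform crossover: each field is inherited from one of the two parents.
    return {k: random.choice([a[k], b[k]]) for k in CHOICES}

def mutate(cfg, rate=0.2):
    # Resample each field with a small probability.
    return {k: (random.choice(CHOICES[k]) if random.random() < rate else v)
            for k, v in cfg.items()}

def evaluate(cfg, budget_epochs):
    # Placeholder for "train the student under these distillation settings for
    # `budget_epochs` epochs and return validation accuracy". Deterministic toy score.
    random.seed(hash(tuple(sorted(cfg.items()))) % (2 ** 32))
    return random.uniform(0.5, 0.8) + 0.01 * budget_epochs

def successive_halving(population, min_budget=1, eta=2, rounds=3):
    # Repeatedly keep the top 1/eta of candidates and grow the training budget.
    budget = min_budget
    for _ in range(rounds):
        scored = sorted(population, key=lambda c: evaluate(c, budget), reverse=True)
        population = scored[: max(1, len(scored) // eta)]
        budget *= eta
    return population

# Evolutionary outer loop: survivors of the pruning become parents of the next generation.
population = [sample_config() for _ in range(16)]
for generation in range(5):
    survivors = successive_halving(population)
    children = [mutate(crossover(*random.sample(survivors, 2)))
                for _ in range(16 - len(survivors))]
    population = survivors + children

print(max(population, key=lambda c: evaluate(c, budget_epochs=8)))
```

In a multi-objective setting such as the one the abstract targets, the selection step inside the loop would instead rank candidates by non-dominated sorting and crowding distance (NSGA-II), trading off, for example, student accuracy against model size.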
Published
2024-03-24
How to Cite
Lu, L., Chen, Z., Lu, X., Rao, Y., Li, L., & Pang, S. (2024). UniADS: Universal Architecture-Distiller Search for Distillation Gap. Proceedings of the AAAI Conference on Artificial Intelligence, 38(13), 14167-14174. https://doi.org/10.1609/aaai.v38i13.29327
Section
AAAI Technical Track on Machine Learning IV