BossNAS: Exploring hybrid CNN-transformers with block-wisely self-supervised neural architecture search. C Li, T Tang, G Wang, J Peng, B Wang, X Liang, X Chang.
Expandera AI Inc. - Cited 543 times - Automatic Machine Learning
Jiefeng Peng
Research interests: Automatic Machine Learning
Kalman normalization: Normalizing internal representations across network layers. G Wang, J Peng, P Luo, X Wang, L Lin. Advances in Neural Information Processing Systems.
Pre-trained language models have shown remarkable results on various NLP tasks. Nevertheless, due to their bulky size and slow inference speed, ...
Lingbo Liu, Associate Researcher, Peng Cheng Laboratory. Verified email at pcl.ac.cn. Jiefeng Peng, Expandera AI Inc ...
Jiefeng Peng. Zhejiang University (ZJU), State Key Lab of Fluid Power Transmission and Control.
In this paper, we present the critical insight that improving the feed-forward network (FFN) in BERT yields a higher gain than improving the multi-head attention (MHA) ...
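To give a concrete sense of why the FFN is a natural target, the short sketch below compares the per-layer parameter counts of the MHA and FFN sub-blocks in a standard BERT-base encoder layer (hidden size 768, FFN intermediate size 3072). This is only an illustrative back-of-the-envelope calculation, not the method of the cited paper; the sizes are the published BERT-base defaults.

```python
# Rough per-layer parameter count for a standard BERT-base encoder layer.
# Illustrative only: shows that the FFN holds roughly twice as many
# parameters as the multi-head attention (MHA) sub-block.

hidden = 768          # BERT-base hidden size
intermediate = 3072   # BERT-base FFN intermediate size (4 * hidden)

# MHA: query, key, value, and output projections, each hidden x hidden (+ bias)
mha_params = 4 * (hidden * hidden + hidden)

# FFN: two linear layers, hidden -> intermediate -> hidden (+ biases)
ffn_params = (hidden * intermediate + intermediate) + (intermediate * hidden + hidden)

print(f"MHA params per layer: {mha_params:,}")          # 2,362,368
print(f"FFN params per layer: {ffn_params:,}")          # 4,722,432
print(f"FFN / MHA ratio:      {ffn_params / mha_params:.2f}")  # ~2.00
```

With these standard sizes the FFN accounts for about two thirds of a layer's weights, which is consistent with the snippet's claim that improving the FFN offers more headroom than improving the MHA.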