
BossNAS: Exploring Hybrid CNN-Transformers with Block-wisely Self-supervised Neural Architecture Search. C. Li, T. Tang, G. Wang, J. Peng, B. Wang, X. Liang, X. Chang.
Expandera AI Inc. - Cited 543 times - Automatic Machine Learning

Jiefeng Peng

Research interests: Automatic Machine Learning
Kalman Normalization: Normalizing Internal Representations Across Network Layers. G. Wang, J. Peng, P. Luo, X. Wang, L. Lin. Advances in Neural Information Processing Systems.
Pre-trained language models have shown remarkable results on various NLP tasks. Nevertheless, due to their bulky size and slow inference speed, ...
Jiefeng Peng (ResearchGate, www.researchgate.net): Cited by 433; Sun Yat-Sen University, Guangzhou (SYSU); 13 publications.
Jiefeng Peng (www.uaa.alaska.edu) - Research interests: Computational fluid dynamics (CFD), propulsion, wind and hydro energy, micro-grid, environmental transports, pipeline flows.
Lingbo Liu, Associate Researcher, Peng Cheng Laboratory. Verified email at pcl.ac.cn. Jiefeng Peng, Expandera AI Inc ...
Jiefeng Peng. Zhejiang University | ZJU · State Key Lab of Fluid Power Transmission and Control.
In this paper, we present the critical insight that improving the feed-forward network (FFN) in BERT yields a higher gain than improving the multi-head attention (MHA) ...
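As a rough, back-of-the-envelope illustration of why the FFN sub-layer is the larger target, the sketch below compares weight counts for the two sub-layers of a single encoder layer, assuming standard BERT-base dimensions (hidden size 768, FFN intermediate size 3072, biases ignored). These figures are general BERT arithmetic, not numbers taken from the paper snippet above.

# Weight-count comparison for one BERT-base encoder layer (Python).
# Assumes the standard BERT-base dimensions; biases and embeddings are ignored.
HIDDEN = 768          # model (hidden) size in BERT-base
INTERMEDIATE = 3072   # FFN inner size (4 * HIDDEN)

# Multi-head attention: Q, K, V and output projections, each HIDDEN x HIDDEN.
mha_weights = 4 * HIDDEN * HIDDEN

# Feed-forward network: two linear maps, HIDDEN -> INTERMEDIATE -> HIDDEN.
ffn_weights = 2 * HIDDEN * INTERMEDIATE

print(f"MHA weights per layer: {mha_weights / 1e6:.2f}M")          # ~2.36M
print(f"FFN weights per layer: {ffn_weights / 1e6:.2f}M")          # ~4.72M
print(f"FFN / MHA ratio: {ffn_weights / mha_weights:.1f}x")        # ~2.0x

Under these assumptions the FFN holds roughly twice the weights of the attention sub-layer, which is consistent with the snippet's argument that FFN improvements offer more headroom than MHA improvements.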