CascadeBERT: Accelerating Inference of Pre-trained Language Models via Calibrated Complete Models Cascade

Lei Li, Yankai Lin, Deli Chen, Shuhuai Ren, Peng Li, Jie Zhou, Xu Sun


Abstract
Dynamic early exiting aims to accelerate the inference of pre-trained language models (PLMs) by emitting predictions in internal layers without passing through the entire model. In this paper, we empirically analyze the working mechanism of dynamic early exiting and find that it faces a performance bottleneck under high speed-up ratios. On one hand, the PLMs’ representations in shallow layers lack high-level semantic information and thus are not sufficient for accurate predictions. On the other hand, the exiting decisions made by internal classifiers are unreliable, leading to wrongly emitted early predictions. We instead propose a new framework for accelerating the inference of PLMs, CascadeBERT, which dynamically selects proper-sized and complete models in a cascading manner, providing comprehensive representations for predictions. We further devise a difficulty-aware objective, encouraging the model to output the class probability that reflects the real difficulty of each instance for a more reliable cascading mechanism. Experimental results show that CascadeBERT can achieve an overall 15% improvement under 4x speed-up compared with existing dynamic early exiting methods on six classification tasks, yielding more calibrated and accurate predictions.
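
The cascading mechanism the abstract describes reduces, in essence, to a confidence-gated fallback between complete models of different sizes. Below is a minimal sketch of that idea, assuming small and large sequence classifiers from the Hugging Face hub; the checkpoint names, the threshold value, and the function shown are illustrative assumptions, not the paper's exact configuration or its released code (see lancopku/cascadebert for that).

```python
# Minimal sketch of a complete-model cascade: run a small complete model
# first and escalate to a large one only when the small model's maximum
# class probability falls below a confidence threshold. Checkpoints and
# the threshold value are hypothetical stand-ins, not the paper's setup.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

SMALL = "prajjwal1/bert-tiny"  # assumed stand-in for the small complete model
LARGE = "bert-base-uncased"    # assumed stand-in for the large complete model
TAU = 0.9                      # hypothetical confidence threshold

tok_small = AutoTokenizer.from_pretrained(SMALL)
tok_large = AutoTokenizer.from_pretrained(LARGE)
model_small = AutoModelForSequenceClassification.from_pretrained(SMALL).eval()
model_large = AutoModelForSequenceClassification.from_pretrained(LARGE).eval()

@torch.no_grad()
def cascade_predict(text: str) -> int:
    """Return a class id, escalating to the large model on low confidence."""
    enc = tok_small(text, return_tensors="pt", truncation=True)
    probs = torch.softmax(model_small(**enc).logits, dim=-1)
    conf, pred = probs.max(dim=-1)
    if conf.item() >= TAU:  # small model is confident enough: stop here
        return int(pred.item())
    enc = tok_large(text, return_tensors="pt", truncation=True)
    return int(model_large(**enc).logits.argmax(dim=-1).item())
```

The difficulty-aware objective in the abstract targets the reliability of this gate: the small model is trained so that its maximum class probability tracks how hard an instance actually is, which is what makes a fixed threshold like TAU above a meaningful exit criterion.
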
Anthology ID:
2021.findings-emnlp.43
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
475–486
URL:
https://aclanthology.org/2021.findings-emnlp.43
DOI:
10.18653/v1/2021.findings-emnlp.43
Cite (ACL):
Lei Li, Yankai Lin, Deli Chen, Shuhuai Ren, Peng Li, Jie Zhou, and Xu Sun. 2021. CascadeBERT: Accelerating Inference of Pre-trained Language Models via Calibrated Complete Models Cascade. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 475–486, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
CascadeBERT: Accelerating Inference of Pre-trained Language Models via Calibrated Complete Models Cascade (Li et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.43.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.43.mp4
Code:
lancopku/cascadebert
Data:
GLUE, MRPC, MultiNLI, QNLI, SST, SST-2