This paper presents a novel compression approach for pre-trained language models (PLMs) based on the matrix product operator (MPO), a technique from quantum many-body physics (Gao et al.).
An MPO factorizes a matrix into a sequential product of local tensors. The motivating question: can the central tensor be compressed for parameter reduction, while only the small auxiliary tensors are updated during fine-tuning?
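As a concrete illustration, here is a minimal NumPy sketch of an MPO-style factorization built from sequential truncated SVDs. The function name mpo_decompose, the shape arguments, and the single max_bond truncation parameter are assumptions for this example, not the paper's implementation:

```python
import numpy as np

def mpo_decompose(W, in_shape, out_shape, max_bond):
    """Factorize W (shape prod(in_shape) x prod(out_shape)) into a chain of
    4-way local tensors via sequential truncated SVDs (tensor-train style).
    Core k has shape (r_{k-1}, i_k, j_k, r_k), with r_0 = r_n = 1."""
    n = len(in_shape)
    # View W as a 2n-way tensor and interleave each (input, output) index pair
    T = W.reshape(*in_shape, *out_shape)
    T = T.transpose([ax for k in range(n) for ax in (k, n + k)])
    cores, r_prev = [], 1
    for k in range(n - 1):
        # Split off the k-th (i_k, j_k) index pair and truncate the SVD rank
        T = T.reshape(r_prev * in_shape[k] * out_shape[k], -1)
        U, S, Vh = np.linalg.svd(T, full_matrices=False)
        r = min(max_bond, S.size)
        cores.append(U[:, :r].reshape(r_prev, in_shape[k], out_shape[k], r))
        T = S[:r, None] * Vh[:r]   # carry the remainder down the chain
        r_prev = r
    cores.append(T.reshape(r_prev, in_shape[-1], out_shape[-1], 1))
    return cores

# Example: factor a 16x16 matrix over (4, 4) x (4, 4) with bond dimension 4
W = np.random.randn(16, 16)
cores = mpo_decompose(W, (4, 4), (4, 4), max_bond=4)
print([c.shape for c in cores])   # [(1, 4, 4, 4), (4, 4, 4, 1)]
```

With an untruncated bond dimension, contracting the cores reproduces W exactly; truncating the bond dimension trades reconstruction error for fewer parameters.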
The full title of the work is "Enabling lightweight fine-tuning for pre-trained language model compression based on matrix product operators" (2021).
Concretely, the paper proposes a model compression method based on matrix product operators (MPO) that substantially reduces the number of parameters in DNN models.
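As a rough, hypothetical illustration of that parameter reduction, the sketch below counts the weights in the MPO cores of a 1024x1024 matrix, reusing the mpo_decompose sketch above. The shapes, the bond dimension, and the choice of the largest core as the "central" tensor are all assumptions for this example:

```python
import numpy as np

# Aggressive rank truncation: the reconstruction is approximate, not exact.
W = np.random.randn(1024, 1024)
cores = mpo_decompose(W, (4, 4, 8, 8), (4, 4, 8, 8), max_bond=16)

dense = W.size                           # 1,048,576 dense weights
total = sum(c.size for c in cores)       # 21,760 MPO weights here (~48x fewer)
central = max(c.size for c in cores)     # largest core: 16,384 weights (~75%)
auxiliary = total - central              # 5,376 weights in the small tensors
print(dense, total, central, auxiliary)
```

Freezing the large central tensor and updating only the auxiliary tensors is what makes the fine-tuning lightweight: under these made-up shapes, only a few thousand of the roughly one million original parameters would remain trainable.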