
deepQuest-py: Large and Distilled Models for Quality Estimation

Fernando Alva-Manchego, Abiola Obamuyide, Amit Gajbhiye, Frédéric Blain, Marina Fomicheva, Lucia Specia


Abstract
We introduce deepQuest-py, a framework for training and evaluation of large and light-weight models for Quality Estimation (QE). deepQuest-py provides access to (1) state-of-the-art models based on pre-trained Transformers for sentence-level and word-level QE; (2) light-weight and efficient sentence-level models implemented via knowledge distillation; and (3) a web interface for testing models and visualising their predictions. deepQuest-py is available at https://github.com/sheffieldnlp/deepQuest-py under a CC BY-NC-SA licence.
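To make the sentence-level setting described in the abstract concrete, the sketch below shows how a pretrained cross-lingual Transformer can be used as a regressor over a source–translation pair. This is an illustration only, not deepQuest-py's actual interface; the encoder name (xlm-roberta-base), the example sentences, and the untrained regression head are assumptions made purely for demonstration.

# Hypothetical sketch of sentence-level QE with a pretrained Transformer.
# NOT deepQuest-py's API; model name and example inputs are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# A multilingual encoder is assumed here; XLM-R is a common choice for QE.
MODEL_NAME = "xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# num_labels=1 gives a single-output (regression) head, e.g. for predicting
# a sentence-level quality score. The head is untrained in this sketch.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=1)
model.eval()

source = "Das ist ein Test."
translation = "This is a test."

# Encode source and machine translation as a single sentence pair.
inputs = tokenizer(source, translation, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()

print(f"Predicted quality score (illustrative only): {score:.4f}")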
Anthology ID:
2021.emnlp-demo.42
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Heike Adel, Shuming Shi
Venue:
EMNLP
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
382–389
Language:
URL:
https://aclanthology.org/2021.emnlp-demo.42
DOI:
10.18653/v1/2021.emnlp-demo.42
Bibkey:
Cite (ACL):
Fernando Alva-Manchego, Abiola Obamuyide, Amit Gajbhiye, Frédéric Blain, Marina Fomicheva, and Lucia Specia. 2021. deepQuest-py: Large and Distilled Models for Quality Estimation. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, pages 382–389, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
deepQuest-py: Large and Distilled Models for Quality Estimation (Alva-Manchego et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-demo.42.pdf
Video:
https://aclanthology.org/2021.emnlp-demo.42.mp4
Code
sheffieldnlp/deepQuest-py
Data
MLQE