
Efficient Nearest Neighbor Language Models

Junxian He, Graham Neubig, Taylor Berg-Kirkpatrick


Abstract
Non-parametric neural language models (NLMs) learn predictive distributions of text utilizing an external datastore, which allows them to learn through explicitly memorizing the training datapoints. While effective, these models often require retrieval from a large datastore at test time, significantly increasing the inference overhead and thus limiting the deployment of non-parametric NLMs in practical applications. In this paper, we take the recently proposed k-nearest neighbors language model as an example, exploring methods to improve its efficiency along various dimensions. Experiments on the standard WikiText-103 benchmark and domain-adaptation datasets show that our methods are able to achieve up to a 6x speed-up in inference speed while retaining comparable performance. The empirical analysis we present may provide guidelines for future research seeking to develop or deploy more efficient non-parametric NLMs.
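The k-nearest neighbors language model that the abstract refers to interpolates the base LM's next-token distribution with a distribution built from retrieved datastore entries. The sketch below illustrates that general formulation (following the standard kNN-LM recipe: softmax over negative distances to the retrieved keys, then linear interpolation); the function name, parameters, and the brute-force NumPy search are illustrative assumptions, not the paper's implementation, which retrieves from a large approximate-nearest-neighbor index.

```python
import numpy as np

def knn_lm_prob(p_lm, query, keys, values, vocab_size,
                k=4, temperature=1.0, lam=0.25):
    """Hypothetical sketch of kNN-LM interpolation.

    p_lm:   (vocab_size,) base LM next-token distribution
    query:  (d,)          context vector for the current position
    keys:   (N, d)        cached context vectors (the datastore)
    values: (N,)          next-token id paired with each key
    """
    # Squared L2 distance from the query to every datastore key
    # (a real system would use an approximate index instead).
    dists = np.sum((keys - query) ** 2, axis=1)
    nn = np.argsort(dists)[:k]                  # indices of the k nearest keys
    weights = np.exp(-dists[nn] / temperature)  # closer neighbors weigh more
    weights /= weights.sum()
    # Aggregate neighbor weight per vocabulary item to form p_kNN.
    p_knn = np.zeros(vocab_size)
    for w, v in zip(weights, values[nn]):
        p_knn[v] += w
    # Final distribution: linear interpolation of the two models.
    return lam * p_knn + (1.0 - lam) * p_lm
```

The test-time cost this paper targets comes from the retrieval step (`dists`/`nn` here), which must scan or query a datastore with millions of entries for every predicted token.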
Anthology ID:
2021.emnlp-main.461
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5703–5714
URL:
https://aclanthology.org/2021.emnlp-main.461
DOI:
10.18653/v1/2021.emnlp-main.461
Cite (ACL):
Junxian He, Graham Neubig, and Taylor Berg-Kirkpatrick. 2021. Efficient Nearest Neighbor Language Models. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 5703–5714, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Efficient Nearest Neighbor Language Models (He et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.461.pdf
Video:
https://aclanthology.org/2021.emnlp-main.461.mp4
Code:
jxhe/efficient-knnlm (+ additional community code)
Data:
WikiText-103, WikiText-2