
InFoBERT: Zero-Shot Approach to Natural Language Understanding Using Contextualized Word Embedding

Pavel Burnyshev, Andrey Bout, Valentin Malykh, Irina Piontkovskaya


Abstract
Natural language understanding is an important task in modern dialogue systems, and it becomes more important with the rapid extension of dialogue systems’ functionality. In this work, we present an approach to zero-shot transfer learning for the tasks of intent classification and slot-filling based on pre-trained language models. We use deep contextualized models, feeding them with utterances and natural language descriptions of user intents to obtain text embeddings. These embeddings are then used by a small neural network to produce predictions for intent and slot probabilities. This architecture achieves new state-of-the-art results in two zero-shot scenarios: single-language new skill adaptation and cross-lingual adaptation.
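The abstract only outlines the architecture, so the following is a minimal NumPy sketch of the general idea, not the paper's actual implementation: contextualized embeddings of the utterance and of a natural-language intent description are paired, and a small head scores intent match and per-token slot probabilities. All dimensions, weight names, and the mean-pooling and concatenation choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Assumed dimensions: 768-d embeddings (BERT-base-like), T utterance
# tokens, S slot labels. These are illustrative, not from the paper.
D, T, S = 768, 6, 4

# Stand-ins for encoder outputs: per-token contextual embeddings of the
# utterance and a pooled embedding of the intent description.
token_emb = rng.standard_normal((T, D))
intent_desc_emb = rng.standard_normal(D)

# Hypothetical small head: one linear scorer for intent match, one for
# per-token slot labels, each over [text embedding; description embedding].
W_intent = rng.standard_normal(2 * D) * 0.01
W_slot = rng.standard_normal((2 * D, S)) * 0.01

utt_emb = token_emb.mean(axis=0)  # simple mean pooling of the utterance
intent_logit = np.concatenate([utt_emb, intent_desc_emb]) @ W_intent
intent_prob = 1.0 / (1.0 + np.exp(-intent_logit))  # P(utterance matches intent)

# Pair every token embedding with the intent description, then score slots.
pair = np.concatenate([token_emb, np.tile(intent_desc_emb, (T, 1))], axis=1)
slot_probs = softmax(pair @ W_slot)  # (T, S): a slot distribution per token

print(slot_probs.shape)
```

Because the intent is supplied as a text description rather than a fixed label index, the same head can in principle score intents unseen during training, which is what enables the zero-shot scenarios described above.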
Anthology ID:
2021.ranlp-1.25
Volume:
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)
Month:
September
Year:
2021
Address:
Held Online
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
SIG:
Publisher:
INCOMA Ltd.
Note:
Pages:
208–215
Language:
URL:
https://aclanthology.org/2021.ranlp-1.25
DOI:
Bibkey:
Cite (ACL):
Pavel Burnyshev, Andrey Bout, Valentin Malykh, and Irina Piontkovskaya. 2021. InFoBERT: Zero-Shot Approach to Natural Language Understanding Using Contextualized Word Embedding. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 208–215, Held Online. INCOMA Ltd.
Cite (Informal):
InFoBERT: Zero-Shot Approach to Natural Language Understanding Using Contextualized Word Embedding (Burnyshev et al., RANLP 2021)
PDF:
https://aclanthology.org/2021.ranlp-1.25.pdf
Data
SGD