
SwitchPrompt: Learning Domain-Specific Gated Soft Prompts for Classification in Low-Resource Domains

Koustava Goswami, Lukas Lange, Jun Araki, Heike Adel


Abstract
Prompting pre-trained language models leads to promising results across natural language processing tasks but is less effective when applied in low-resource domains, due to the domain gap between the pre-training data and the downstream task. In this work, we bridge this gap with a novel and lightweight prompting methodology called SwitchPrompt for adapting language models trained on general-domain datasets to diverse low-resource domains. Using domain-specific keywords with a trainable gated prompt, SwitchPrompt offers domain-oriented prompting, that is, effective guidance on the target domains for general-domain language models. Our few-shot experiments on three text classification benchmarks demonstrate the efficacy of general-domain pre-trained language models when used with SwitchPrompt. They often even outperform their domain-specific counterparts trained with state-of-the-art baseline prompting methods, with accuracy gains of up to 10.7%. This result indicates that SwitchPrompt effectively reduces the need for domain-specific language model pre-training.
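
The abstract describes the gated soft prompt only at a high level. The sketch below shows, in PyTorch, one way a trainable gate could interpolate between general soft-prompt vectors and embeddings of domain-specific keywords before the prompt is prepended to the input. It is a minimal illustration under these assumptions, not the authors' implementation; names such as GatedSoftPrompt and keyword_ids are hypothetical.

    # Illustrative sketch only; the actual SwitchPrompt formulation may differ.
    import torch
    import torch.nn as nn

    class GatedSoftPrompt(nn.Module):
        """Combines trainable soft-prompt vectors with keyword embeddings via a gate."""

        def __init__(self, embedding: nn.Embedding, keyword_ids: torch.Tensor):
            super().__init__()
            prompt_len, hidden = keyword_ids.size(0), embedding.embedding_dim
            # Trainable general-domain soft-prompt vectors (one per prompt position).
            self.soft_prompt = nn.Parameter(torch.randn(prompt_len, hidden) * 0.02)
            # Embeddings of the domain-specific keywords, kept frozen here.
            self.register_buffer("keyword_emb", embedding(keyword_ids).detach())
            # Trainable gate, one scalar per prompt position.
            self.gate = nn.Parameter(torch.zeros(prompt_len, 1))

        def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
            # Sigmoid gate switches between soft-prompt and keyword representations.
            g = torch.sigmoid(self.gate)
            prompt = g * self.soft_prompt + (1.0 - g) * self.keyword_emb
            prompt = prompt.unsqueeze(0).expand(input_embeds.size(0), -1, -1)
            # Prepend the gated prompt to the input token embeddings.
            return torch.cat([prompt, input_embeds], dim=1)

In such a setup, the resulting prompt-augmented embeddings would be passed to a (possibly frozen) general-domain language model together with a classification head.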
Anthology ID:
2023.eacl-main.197
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
SIG:
Publisher:
Association for Computational Linguistics
Note:
Pages:
2689–2695
Language:
URL:
https://aclanthology.org/2023.eacl-main.197
DOI:
10.18653/v1/2023.eacl-main.197
Bibkey:
Cite (ACL):
Koustava Goswami, Lukas Lange, Jun Araki, and Heike Adel. 2023. SwitchPrompt: Learning Domain-Specific Gated Soft Prompts for Classification in Low-Resource Domains. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 2689–2695, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
SwitchPrompt: Learning Domain-Specific Gated Soft Prompts for Classification in Low-Resource Domains (Goswami et al., EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.197.pdf
Video:
https://aclanthology.org/2023.eacl-main.197.mp4