
Interventional Training for Out-Of-Distribution Natural Language Understanding

Sicheng Yu, Jing Jiang, Hao Zhang, Yulei Niu, Qianru Sun, Lidong Bing


Abstract
Out-of-distribution (OOD) settings are used to measure a model's performance when the distribution of the test data differs from that of the training data. NLU models are known to suffer in OOD settings. We study this issue from the perspective of causality, which sees confounding bias as the reason models learn spurious correlations. While a common solution is to perform intervention, existing methods handle only a single, known confounder, whereas in many NLU tasks the confounders can be both unknown and multifactorial. In this paper, we propose a novel interventional training method called Bottom-up Automatic Intervention (BAI) that performs multi-granular intervention with identified multifactorial confounders. Our experiments on three NLU tasks, namely natural language inference, fact verification, and paraphrase identification, show the effectiveness of BAI for tackling OOD settings.
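For context, the "intervention" the abstract refers to is, in causal inference, standardly realized via backdoor adjustment, which replaces the observational distribution P(Y | X) with a deconfounded one in which the confounder Z is stratified and marginalized out. The abstract itself gives no formula; the textbook form below is background, not necessarily the exact estimator used by BAI:

    P(Y | do(X)) = Σ_z P(Y | X, Z = z) · P(Z = z)

Intuitively, instead of letting Z influence both the input X and the label Y, the model predicts Y under each stratum of Z and averages the predictions by the prior P(Z), which blocks the spurious path through the confounder.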
Anthology ID:
2022.emnlp-main.799
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11627–11638
URL:
https://aclanthology.org/2022.emnlp-main.799
DOI:
10.18653/v1/2022.emnlp-main.799
Cite (ACL):
Sicheng Yu, Jing Jiang, Hao Zhang, Yulei Niu, Qianru Sun, and Lidong Bing. 2022. Interventional Training for Out-Of-Distribution Natural Language Understanding. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 11627–11638, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Interventional Training for Out-Of-Distribution Natural Language Understanding (Yu et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.799.pdf