Multi-source domain adaptation with mixture of experts

J Guo, DJ Shah, R Barzilay - arXiv preprint arXiv:1809.02256, 2018 - arxiv.org
We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to-set metric, determines how to combine predictors trained on various domains. The metric is learned in an unsupervised fashion using meta-training. Experimental results on sentiment analysis and part-of-speech tagging demonstrate that our approach consistently outperforms multiple baselines and can robustly handle negative transfer.
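As a rough illustration of the mechanism the abstract describes (per-domain expert predictors whose outputs are combined by weights derived from a point-to-set metric between a target example and each source domain), here is a minimal PyTorch sketch. All names here are hypothetical, and the Mahalanobis-style distance to each domain's mean encoding is an assumed stand-in for the paper's actual metric, which is learned in an unsupervised fashion via meta-training.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoEDomainAdapter(nn.Module):
    """Hypothetical sketch: combine per-source-domain expert predictions,
    weighted by a learned point-to-set metric between a target example
    and each source domain's set of encoded examples."""

    def __init__(self, experts, feat_dim):
        super().__init__()
        # One classifier per source domain, each trained on its own domain.
        self.experts = nn.ModuleList(experts)
        # Learnable matrix parameterizing the metric (assumption: a simple
        # bilinear form; the paper's metric may differ).
        self.metric = nn.Parameter(torch.eye(feat_dim))

    def point_to_set(self, x, domain_sets):
        # x: (B, D) target features; domain_sets: list of (N_i, D) tensors
        # of encoded source examples, one tensor per domain.
        scores = []
        for S in domain_sets:
            mu = S.mean(dim=0)                       # set summary: domain mean
            diff = x - mu                            # (B, D)
            d = (diff @ self.metric * diff).sum(-1)  # Mahalanobis-style distance
            scores.append(-d)                        # closer domain -> higher score
        return torch.stack(scores, dim=-1)           # (B, K)

    def forward(self, x, domain_sets):
        # Softmax over domains turns distances into mixture weights.
        alpha = F.softmax(self.point_to_set(x, domain_sets), dim=-1)  # (B, K)
        preds = torch.stack([e(x) for e in self.experts], dim=1)      # (B, K, C)
        return (alpha.unsqueeze(-1) * preds).sum(dim=1)               # (B, C)
```

In this sketch, a target example near one source domain's examples under the learned metric receives a prediction dominated by that domain's expert, while examples far from all domains get a flatter mixture, which is one way such a scheme can dampen negative transfer from mismatched sources.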