Multi-Source and Multi-Target Domain Adaptation Based on Dynamic Generator With Attention
As a branch of domain adaptation (DA), multi-source DA (MSDA) is a challenging problem that aims to transfer knowledge from multiple well-labeled source domains to a target domain for target tasks. However, most existing work focuses on single-target domain adaptation and does not account for multiple target domains, which we believe provide valuable knowledge. Meanwhile, in multi-source and multi-target adaptation scenarios, feature generators with static parameters have difficulty generating deep features for each individual domain. In this article, we propose a Dynamic Generator With Attention (DGWA) method that adapts domain-agnostic deep features in multi-source and multi-target scenarios. The feature generator dynamically changes its parameters as data arrive from different domains, which greatly improves the generalization of the generated features, and an attention mechanism helps DGWA learn more transferable information across domains. To demonstrate the performance of DGWA, we conduct extensive experiments on several popular domain adaptation datasets: Digits, Office+Caltech10, Office-Home, and ImageCLEF-DA. The experimental results demonstrate that our method performs better than state-of-the-art methods.
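The abstract does not give the architecture, but the core idea of a generator whose effective parameters vary with the input domain can be sketched. The following is a minimal, hypothetical illustration (not the authors' DGWA implementation): the generator keeps several static candidate weight banks, and an attention module scores them from simple input statistics, so the effective weights become an input-conditioned mixture. All names (`DynamicGenerator`, `num_banks`, the attention projection) are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(z - z.max())
    return e / e.sum()

class DynamicGenerator:
    """Hypothetical sketch of a dynamic-parameter feature generator:
    the effective weight matrix is an attention-weighted mixture of
    K static weight banks, so it changes with the input domain."""

    def __init__(self, in_dim, out_dim, num_banks=4, seed=0):
        rng = np.random.default_rng(seed)
        # K candidate weight banks (static; mixed dynamically at run time)
        self.banks = rng.standard_normal((num_banks, in_dim, out_dim)) * 0.1
        # attention projection: input statistics -> bank scores
        self.attn = rng.standard_normal((in_dim, num_banks)) * 0.1

    def forward(self, x):
        # batch-level statistics stand in for a domain descriptor
        stats = x.mean(axis=0)                        # (in_dim,)
        scores = softmax(stats @ self.attn)           # (K,), sums to 1
        # dynamic weights: convex combination of the banks
        W = np.tensordot(scores, self.banks, axes=1)  # (in_dim, out_dim)
        return x @ W, scores
```

Feeding batches with different statistics (i.e. from different domains) yields different attention scores and hence different effective generator weights, which is the behavior the abstract attributes to the dynamic generator.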