FedAL: Black-Box Federated Knowledge Distillation Enabled by Adversarial Learning
Abstract
Publisher
IEEE Press
Qualifiers
- Research-article