Communication-Efficient Personalized Federated Learning on Non-IID Data
X Li, C Ma, B Huang, G Li
2023 19th International Conference on Mobility, Sensing and …, 2023 (ieeexplore.ieee.org)
In this paper, we explore the challenges associated with federated learning, a distributed machine learning paradigm that enables collaborative model training while preserving the privacy of local client data. One significant hurdle is the non-IID nature of clients' data, alongside the limited communication resources between clients and the cloud server. This statistical heterogeneity, together with the communication constraints, poses practical obstacles to the deployment of federated learning. To address these challenges, we propose a communication-efficient framework for personalized federated learning called GCPFL. Our framework enables individual clients to train personalized models while substantially reducing communication costs. Specifically, each client compresses its gradient before uploading it and compensates for the effects of gradient compression through an error correction process. Because only the compressed gradients are uploaded, communication costs are significantly reduced. On the cloud server side, the received gradients are recovered into models, and similarity-based aggregation is performed on these models to facilitate collaboration among clients. Once the aggregated models are received, clients perform local updates to obtain personalized models. Extensive experimental results show that the GCPFL algorithm not only achieves high model accuracy but also substantially reduces communication costs compared with existing methods.
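To make the round structure concrete, the following is a minimal sketch of one GCPFL-style communication round. The abstract does not specify the compressor, the error-correction rule, or the similarity metric, so top-k sparsification, standard error feedback, and cosine similarity are assumptions made here for illustration; all function and class names (topk_compress, decompress, Client, similarity_aggregate, server_round) are hypothetical, not from the paper.

```python
import numpy as np

# Illustrative sketch of one GCPFL-style round. Top-k sparsification,
# error feedback, and cosine-similarity weighting are assumed choices;
# the paper's abstract names the steps but not these specifics.

def topk_compress(grad, k):
    """Keep only the k largest-magnitude entries; return (indices, values)."""
    idx = np.argsort(np.abs(grad))[-k:]
    return idx, grad[idx]

def decompress(idx, vals, dim):
    """Recover a dense vector from its sparse (indices, values) form."""
    g = np.zeros(dim)
    g[idx] = vals
    return g

class Client:
    def __init__(self, dim, k):
        self.error = np.zeros(dim)  # residual kept locally for error correction
        self.k = k

    def upload(self, grad):
        # Error correction: fold the residual left over from earlier rounds
        # into the current gradient, compress, and store what was discarded.
        corrected = grad + self.error
        idx, vals = topk_compress(corrected, self.k)
        self.error = corrected - decompress(idx, vals, corrected.size)
        return idx, vals  # only this sparse payload is transmitted

def similarity_aggregate(models):
    """Server side: build a personalized aggregate for each client by
    weighting all recovered client models with their cosine similarity."""
    aggregated = []
    for mi in models:
        sims = np.array([np.dot(mi, mj) /
                         (np.linalg.norm(mi) * np.linalg.norm(mj) + 1e-12)
                         for mj in models])
        w = np.maximum(sims, 0.0)  # ignore dissimilar (negative) clients
        w /= w.sum()               # self-similarity keeps the sum positive
        aggregated.append(sum(wj * mj for wj, mj in zip(w, models)))
    return aggregated

def server_round(client_models, uploads, lr):
    # Recover each compressed gradient into a model by applying it to the
    # server's copy of that client's model, then aggregate by similarity.
    recovered = [m - lr * decompress(idx, vals, m.size)
                 for m, (idx, vals) in zip(client_models, uploads)]
    return similarity_aggregate(recovered)
```

After receiving its aggregated model from the server, each client would then run a few local training steps on its own data, which corresponds to the final local-update step the abstract describes for obtaining personalized models.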