Overcoming the Challenges of Batch Normalization in Federated Learning

R Guerraoui, R Pinot, G Rizk, J Stephan… - arXiv preprint arXiv:2405.14670, 2024 - arxiv.org
Batch normalization has proven to be a very beneficial mechanism for accelerating the training and improving the accuracy of deep neural networks in centralized environments. Yet the scheme faces significant challenges in federated learning, especially under high data heterogeneity, where the main difficulties arise from external covariate shift and inconsistent statistics across clients. In this paper, we introduce Federated BatchNorm (FBN), a novel scheme that restores the benefits of batch normalization in federated learning. FBN ensures that batch normalization during training is consistent with what would be achieved in a centralized execution, hence preserving the distribution of the data and providing running statistics that accurately approximate the global statistics. FBN thereby reduces the external covariate shift and matches the evaluation performance of the centralized setting. We also show that, with a slight increase in complexity, FBN can be made robust to erroneous statistics and potentially adversarial attacks.
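The abstract does not spell out FBN's aggregation rule, so the following is only a hedged sketch of the general idea it alludes to: clients report local batch statistics, and a server combines them so the resulting mean and variance match what a centralized batch would yield, via the law of total variance. All names here (aggregate_bn_stats, client_stats) are hypothetical illustrations, not the paper's actual FBN algorithm.

# Sketch (assumed, not from the paper): combining per-client batch-norm
# statistics into global statistics, as a centralized run would compute them.
import numpy as np

def aggregate_bn_stats(client_stats):
    """Combine per-client (count, mean, var) triples into global statistics.

    Uses the law of total variance: global variance = weighted average of
    client variances + weighted variance of the client means.
    """
    counts = np.array([n for n, _, _ in client_stats], dtype=np.float64)
    means = np.stack([m for _, m, _ in client_stats])   # shape (clients, features)
    vars_ = np.stack([v for _, _, v in client_stats])   # shape (clients, features)

    weights = counts / counts.sum()                     # fraction of samples per client
    global_mean = np.tensordot(weights, means, axes=1)
    # Within-client variance plus between-client variance of the means.
    global_var = (np.tensordot(weights, vars_, axes=1)
                  + np.tensordot(weights, (means - global_mean) ** 2, axes=1))
    return global_mean, global_var

# Example: three clients with heterogeneous data report per-feature statistics.
rng = np.random.default_rng(0)
stats = [(n, rng.normal(size=4), rng.uniform(0.5, 2.0, size=4))
         for n in (128, 256, 64)]
mean, var = aggregate_bn_stats(stats)
print(mean, var)

Under high heterogeneity, the between-client term dominates, which is exactly the inconsistency that naive per-client normalization ignores; any scheme matching centralized statistics has to account for it in some form.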