Neural-aware Decoupling Fusion based Personalized Federated Learning for Intelligent Sensing

Published: 22 November 2024

Abstract

Personalized federated learning (PFL) is a framework that optimizes an individual model for each client, providing better privacy and flexibility. However, in challenging intelligent sensing applications, heterogeneous client data distributions make the aggregation of local models on the server unstable or even hard to converge. To deal with the resulting performance degradation, existing PFL methods focus on how to fine-tune the global model but ignore the impact of the global model fusion algorithm on the results. In this article, we propose a new explainable neural-aware decoupling fusion based PFL framework, p-FedADF, to address these challenges. It contains two carefully designed modules. The local decoupling module, deployed on the client, uses an architecture disentanglement technique to decouple the feature extractor of the client's local model into sub-networks according to data categories; through training, it captures the inference process of feature extraction for each category of data. The global aggregation module, deployed on the server, aligns the sub-network positions across clients and performs fine-grained aggregation of a generic feature extractor. In addition, we provide a mask encoding scheme to reduce the communication overhead of transmitting the sub-network sets between the server and clients. p-FedADF achieves 1.6%, 0.2%, 2.3%, and 4.5% improvements over state-of-the-art methods on a real-world dataset and three benchmark datasets.
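
The following minimal sketch (Python with NumPy) illustrates the workflow described above under stated assumptions; it is not the authors' implementation. Each client produces one binary sub-network mask per data category (here drawn at random rather than learned by architecture disentanglement), the masks are run-length encoded as one possible mask encoding scheme, and the server aligns sub-networks by category before a fine-grained, mask-aware average of the feature extractor. The names client_update, rle_encode, and aggregate are illustrative assumptions.

    # Hypothetical sketch of the p-FedADF workflow described in the abstract.
    # Masks are random rather than learned; function names are illustrative.
    import numpy as np

    def client_update(params, categories, rng):
        """Client side: one binary sub-network mask per data category.
        In the paper the masks come from architecture disentanglement during
        local training; random sparse masks keep this sketch runnable."""
        return {c: rng.random(params.shape) < 0.3 for c in categories}

    def rle_encode(mask):
        """Run-length encode a flat binary mask (one possible mask encoding
        scheme for reducing client-server communication)."""
        runs, count, cur = [], 0, False
        for bit in mask:
            if bool(bit) == cur:
                count += 1
            else:
                runs.append(count)
                cur, count = bool(bit), 1
        runs.append(count)
        return runs

    def aggregate(client_params, client_masks):
        """Server side: align sub-networks by category, then average each
        parameter only over the clients whose sub-network for that category
        covers it (a fine-grained, mask-aware fusion)."""
        num = np.zeros_like(client_params[0])
        den = np.zeros_like(client_params[0])
        categories = set().union(*(m.keys() for m in client_masks))
        for c in categories:
            for p, masks in zip(client_params, client_masks):
                if c in masks:
                    num += p * masks[c]
                    den += masks[c]
        fallback = np.mean(client_params, axis=0)   # parameters no mask covers
        return np.where(den > 0, num / np.maximum(den, 1), fallback)

    rng = np.random.default_rng(0)
    clients = [rng.normal(size=64) for _ in range(3)]   # 3 clients' extractor weights
    masks = [client_update(p, categories=[0, 1], rng=rng) for p in clients]
    global_extractor = aggregate(clients, masks)
    print(global_extractor.shape, len(rle_encode(masks[0][0])))

A real deployment would operate on the layer-wise tensors of a deep feature extractor and obtain the masks from local training, but the server-side logic is the same: sub-network positions are matched by category before averaging, so each parameter is fused only across the clients that actually use it.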


Cited By

  • (2024) A Novel Multi-view Hypergraph Adaptive Fusion Approach for Representation Learning. In Proceedings of the Third International Workshop on Social and Metaverse Computing, Sensing and Networking, 43–49. DOI: 10.1145/3698387.3700000. Online publication date: 4 November 2024.


    Published In

    ACM Transactions on Sensor Networks, Volume 20, Issue 6
    November 2024
    422 pages
    EISSN: 1550-4867
    DOI: 10.1145/3613636
    Editor: Wen Hu

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 22 November 2024
    Online AM: 30 September 2024
    Accepted: 04 September 2024
    Revised: 11 July 2024
    Received: 06 November 2023
    Published in TOSN Volume 20, Issue 6

    Author Tags

    1. Personalized federated learning (PFL)
    2. Internet of Things (IoT)
    3. intelligent sensing
    4. model fusion

    Qualifiers

    • Research-article

    Funding Sources

    • National Natural Science Foundation of China
    • A3 Foresight Program of NSFC
    • National Key R&D Program of China

    Article Metrics

    • Downloads (last 12 months): 154
    • Downloads (last 6 weeks): 19
    Reflects downloads up to 13 February 2025
