Abstract
Collaborative learning, such as federated learning, makes it possible to train a global prediction model in a distributed way without sharing the training data. However, most existing schemes adopt deep learning models and require all local models to share the architecture of the global model, making them unsuitable for applications running on resource- and bandwidth-constrained devices. In this paper, we present CloREF, a novel rule-based collaborative learning framework that allows participating devices to use different local learning models. A rule extraction method is first proposed to bridge the heterogeneity of local learning models by approximating their decision boundaries. A novel rule fusion and selection mechanism, based on evolutionary optimization, is then designed to integrate the knowledge learned by all local models. Experimental results on a number of synthetic and real-world datasets demonstrate that the rules generated by our rule extraction method mimic the behaviors of various learning models with high fidelity (>0.95 in most tests), and that CloREF achieves AUC comparable to, and sometimes better than, that of the best-performing model trained centrally.
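The paper itself does not include code, but the rule-extraction idea can be illustrated with a small surrogate-model sketch. The Python snippet below is a hypothetical illustration, not the CloREF algorithm: it trains a black-box SVM as a stand-in local model, fits a shallow decision tree to the SVM's predictions so that the tree's root-to-leaf paths act as extracted rules, and reports fidelity as the fraction of inputs on which the rules agree with the black box. All concrete choices (SVC, max_depth=3, the synthetic dataset) are assumptions for the sake of the example.

    # Hypothetical sketch of surrogate rule extraction; not the paper's
    # actual CloREF procedure.
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier, export_text
    from sklearn.metrics import accuracy_score

    X, y = make_classification(n_samples=1000, n_features=4, random_state=0)

    # Local "black-box" model whose decision boundary we approximate.
    black_box = SVC().fit(X, y)
    bb_labels = black_box.predict(X)

    # Shallow tree trained on the black box's outputs; each root-to-leaf
    # path is an if-then rule approximating the decision boundary.
    surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
    surrogate.fit(X, bb_labels)
    print(export_text(surrogate))

    # Fidelity: how often the extracted rules agree with the black box.
    fidelity = accuracy_score(bb_labels, surrogate.predict(X))
    print(f"fidelity = {fidelity:.3f}")

In a CloREF-like setting, each client would share rules of this kind instead of model weights, and the server would fuse and select a well-performing rule subset (via evolutionary optimization in the paper); that selection step is omitted from this sketch.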
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Pang, Y., Zhang, H., Deng, J.D., Peng, L., Teng, F. (2022). Rule-Based Collaborative Learning with Heterogeneous Local Learning Models. In: Gama, J., Li, T., Yu, Y., Chen, E., Zheng, Y., Teng, F. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2022. Lecture Notes in Computer Science, vol. 13280. Springer, Cham. https://doi.org/10.1007/978-3-031-05933-9_50
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-05932-2
Online ISBN: 978-3-031-05933-9