DOI: 10.1145/3428662.3428792

Decentralized machine learning using compressed push-pull averaging

Published: 04 January 2021

Abstract

Communication efficiency is a central issue for decentralized learning algorithms. On the one hand, good machine learning models require more and more parameters. On the other hand, transferring data over P2P channels is relatively costly due to bandwidth limitations and unreliability. Here, we propose a novel compression mechanism for P2P machine learning based on applying stateful codecs over P2P links. In addition, we rely on transfer learning for further compression: we train a relatively small model on top of a fixed, high-quality pre-trained feature set. We demonstrate these contributions through an experimental analysis over a real smartphone trace.
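To illustrate the idea of a stateful codec over a P2P link, the sketch below shows one possible such scheme in Python: each directed link keeps mirrored encoder/decoder state, so only a quantized difference from the last reconstructed model crosses the link, and a push-pull exchange moves both peers toward the average of the reconstructed models. The names (`DeltaCodec`, `push_pull_step`) and the uniform delta-quantization scheme are illustrative assumptions, not the codec proposed in the paper.

```python
import numpy as np

class DeltaCodec:
    """One half of a stateful codec for a directed P2P link (sketch).

    Encoder and decoder are separate instances that start from the same
    zero state and stay in sync because the identical quantized payload
    is applied on both ends.
    """
    def __init__(self, dim, step=0.05):
        self.state = np.zeros(dim)  # last reconstructed vector
        self.step = step            # quantization step size

    def encode(self, x):
        # Transmit only the quantized difference from the shared state.
        q = np.round((x - self.state) / self.step).astype(np.int32)
        self.state += q * self.step  # keep the sender's mirror in sync
        return q                     # compact integer payload

    def decode(self, q):
        self.state += q * self.step  # replay the same update
        return self.state.copy()     # receiver's reconstruction of x


def push_pull_step(x_a, x_b, link):
    """One compressed push-pull averaging exchange between nodes A and B.

    `link` holds four codec halves: the A->B encoder/decoder pair and
    the B->A encoder/decoder pair.
    """
    enc_ab, dec_ab, enc_ba, dec_ba = link
    b_view_of_a = dec_ab.decode(enc_ab.encode(x_a))  # A pushes its model
    a_view_of_b = dec_ba.decode(enc_ba.encode(x_b))  # B replies (pull)
    # Each node averages its own model with the *reconstructed* peer model.
    return 0.5 * (x_a + a_view_of_b), 0.5 * (x_b + b_view_of_a)


if __name__ == "__main__":
    dim = 4
    link = tuple(DeltaCodec(dim) for _ in range(4))
    a, b = np.array([1.0, 2.0, 3.0, 4.0]), np.zeros(dim)
    for _ in range(10):
        a, b = push_pull_step(a, b, link)
    # both peers end near the true mean [0.5, 1.0, 1.5, 2.0],
    # up to quantization error
    print(a, b)
```

Because each directed link carries deltas against shared state, a model that changes little between exchanges costs only a few small integers per round, which is the basic payoff of a stateful (as opposed to memoryless) codec.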


Cited By

  • (2023) Improving Gossip Learning via Limited Model Merging. Advances in Computational Collective Intelligence, 10.1007/978-3-031-41774-0_28, pp. 351-363. Online publication date: 22-Sep-2023.




Published In

DICG'20: Proceedings of the 1st International Workshop on Distributed Infrastructure for Common Good
December 2020
52 pages
ISBN:9781450381970
DOI:10.1145/3428662
© 2020 Association for Computing Machinery. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of a national government. As such, the Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. compressed communication
  2. decentralized averaging
  3. machine learning

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • Nemzeti Kutatási Fejlesztési és Innovációs Hivatal (National Research, Development and Innovation Office, Hungary)

Conference

Middleware '20: 21st International Middleware Conference
December 7-11, 2020
Delft, Netherlands

