
DOI: 10.1145/3635059.3635069 · PCI Conference Proceedings
Research article · Open access

Recommender Systems based on Parallel and Distributed Deep Learning

Published: 14 February 2024

Abstract

As individuals have become overloaded with information, Recommender Systems (RS) were created to provide machine-generated recommendations. Machine Learning methods have driven significant advances in RS; Deep Learning (DL) in particular has become extremely popular. Although deep neural networks (DNNs) notably improve the performance of RS, they also make these systems larger and more memory-intensive. A remedy is to add (data or model) parallel and distributed algorithms to DL-based RS. In this paper, we present our large-scale, multi-staged, hybrid RS, which processes a million-scale dataset, as well as the most noteworthy parallel and/or distributed DL systems. Finally, we outline directions for the future evolution of our RS by incorporating features and ideas from such systems.
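To make the data-parallel idea concrete, here is a minimal, self-contained sketch (all names hypothetical; plain Python stands in for a real framework such as PyTorch's DistributedDataParallel): each worker computes a gradient on its own shard of the mini-batch, and an all-reduce-style average of the local gradients reproduces the update a single worker would compute on the full batch.

```python
# Illustrative sketch of data-parallel SGD for a 1-D least-squares model.
# "Workers" are simulated sequentially; in a real system each shard's
# gradient would be computed on a separate device and combined via all-reduce.

def grad_mse(w, shard):
    # Gradient of 0.5 * (w*x - y)^2 w.r.t. w, averaged over the shard.
    return sum((w * x - y) * x for x, y in shard) / len(shard)

def data_parallel_step(w, batch, n_workers, lr=0.1):
    # Split the batch into equal-size shards, one per worker.
    shards = [batch[i::n_workers] for i in range(n_workers)]
    local_grads = [grad_mse(w, s) for s in shards]  # computed in parallel
    g = sum(local_grads) / n_workers                # all-reduce (average)
    return w - lr * g                               # identical update everywhere

batch = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]  # y = 2x
w = 0.0
for _ in range(100):
    w = data_parallel_step(w, batch, n_workers=2)
print(round(w, 3))  # → 2.0
```

Because the shards are equal-sized, the average of the per-worker gradients equals the full-batch gradient, so all workers stay in lockstep; model parallelism, by contrast, splits the network itself (layers or parameter shards) across devices.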



Published In

PCI '23: Proceedings of the 27th Pan-Hellenic Conference on Progress in Computing and Informatics, November 2023, 304 pages. This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. deep learning
  2. distributed systems
  3. machine learning
  4. parallel systems
  5. recommender systems

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • MCIN and the European Union

Conference

PCI 2023

Acceptance Rates

Overall Acceptance Rate 190 of 390 submissions, 49%

