Abstract
In recent years, deep learning models have achieved outstanding results in computer vision and speech recognition. Deep learning is now also being exploited effectively to address major Big Data problems, including fast information retrieval, language translation, and data classification. In this work, we implemented a news article headline generation application to analyze the performance of our framework, Spark-DLF. Training deep learning models requires extensive data and computation. Our proposed framework accelerates training by distributing model replicas, trained with stochastic gradient descent, across cluster nodes. We conducted a performance analysis to evaluate how well the framework handles Big Data workloads.
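The abstract describes a data-parallel training scheme in which each cluster node runs a model replica on its shard of the data and the driver combines the gradients each round. The sketch below illustrates that general pattern (synchronous, parameter-averaging SGD) on Spark; it is not Spark-DLF's actual API. The toy linear model and all function names (e.g., partition_gradient, train) are illustrative assumptions.

# Minimal sketch of data-parallel synchronous SGD on Spark (parameter averaging).
# Assumptions: a toy linear-regression model stands in for the deep network;
# all names here are hypothetical, not Spark-DLF's real interface.
import numpy as np
from pyspark import SparkContext

def partition_gradient(weights, rows):
    # Gradient of 0.5*(x.w - y)^2 accumulated over one data partition.
    grad = np.zeros_like(weights)
    n = 0
    for x, y in rows:
        grad += (x.dot(weights) - y) * x
        n += 1
    return grad, n

def train(sc, data, dim, epochs=10, lr=0.01):
    rdd = sc.parallelize(data, numSlices=4).cache()   # shard the data across workers
    weights = np.zeros(dim)
    for _ in range(epochs):
        w_bc = sc.broadcast(weights)                  # ship the current replica to every worker
        # Each partition computes a gradient against its local shard.
        grads = rdd.mapPartitions(
            lambda rows: [partition_gradient(w_bc.value, rows)]
        ).collect()
        total = sum(g for g, _ in grads)
        count = sum(n for _, n in grads)
        weights -= lr * total / max(count, 1)         # driver applies the averaged update
    return weights

if __name__ == "__main__":
    sc = SparkContext("local[*]", "sgd-sketch")
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0, 0.5])
    X = rng.normal(size=(200, 3))
    data = [(x, float(x.dot(true_w))) for x in X]
    print(train(sc, data, dim=3))
    sc.stop()

In this pattern the synchronization cost is one broadcast and one collect per epoch; asynchronous parameter-server variants (as in Dean et al.'s work cited by the paper) trade that synchronization for staler gradients.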
Acknowledgements
This paper was written as part of Konkuk University's research support program for its faculty on sabbatical leave in 2017 and was supported by an Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIP) (No. R0113-16-0008, Development of SaaS Aggregation Technology for Cloud Service Mashup).