Abstract
Deep neural networks have recently demonstrated their effectiveness in image classification, typically by extending the depth and width of the network architecture. However, such large architectures can suffer from poor convergence, over-fitting and vanishing gradients. DenseNet was developed to address these problems. Although DenseNet adopts a bottleneck layer in its DenseBlocks to avoid relearning feature maps and to reduce parameters, this operation may skip or lose important features. Moreover, it still requires considerable computational power when the depth and width of the network are increased for better classification. In this paper, we propose a variant of DenseNet, named Multipath Feature Recalibration DenseNet (MFR-DenseNet), which stacks convolution layers instead of adopting a bottleneck to improve feature extraction. In addition, we build multipath DenseBlocks with a Squeeze-and-Excitation (SE) module to model the interdependencies of useful feature maps among different DenseBlocks. Experiments on CIFAR-10, CIFAR-100, MNIST and SVHN demonstrate the efficiency of our network, which further reduces redundancy while maintaining the high accuracy of DenseNet.
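As an illustration of the two architectural ideas summarized above (channel recalibration with an SE module, and dense layers that stack convolutions instead of using a 1x1 bottleneck), a minimal PyTorch sketch is given below. The class names, the growth_rate parameter and the reduction ratio of 16 are illustrative assumptions and do not reproduce the authors' exact implementation.

import torch
import torch.nn as nn

# Hypothetical sketch: SE channel recalibration (Hu et al. 2018) and a dense layer
# that stacks two 3x3 convolutions instead of a 1x1 bottleneck. Names and
# hyper-parameters (reduction=16, growth_rate) are illustrative assumptions.

class SEModule(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                        # squeeze: global average pooling
            nn.Conv2d(channels, channels // reduction, 1),  # excitation: reduction FC (as 1x1 conv)
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),                                   # per-channel weights in (0, 1)
        )

    def forward(self, x):
        return x * self.gate(x)                             # recalibrate each feature map

class StackedConvDenseLayer(nn.Module):
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.body = nn.Sequential(
            nn.BatchNorm2d(in_channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels, growth_rate, 3, padding=1, bias=False),
            nn.BatchNorm2d(growth_rate),
            nn.ReLU(inplace=True),
            nn.Conv2d(growth_rate, growth_rate, 3, padding=1, bias=False),
        )

    def forward(self, x):
        # Dense connectivity: concatenate the newly produced feature maps with the input.
        return torch.cat([x, self.body(x)], dim=1)

In MFR-DenseNet the SE recalibration is applied across multiple DenseBlock paths; the sketch only shows the per-block building blocks under the stated assumptions.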
Funding
This research was supported by the National Natural Science Foundation of China (Grants 61671152 and 61901119).
Cite this article
Chen, B., Zhao, T., Liu, J. et al. Multipath feature recalibration DenseNet for image classification. Int. J. Mach. Learn. & Cyber. 12, 651–660 (2021). https://doi.org/10.1007/s13042-020-01194-4