Abstract
Deep neural networks have been successfully applied in various fields and have achieved significant results on many typical tasks, especially in computer vision. However, deep neural networks are usually trained with gradient-descent-based algorithms, which suffer from vanishing and exploding gradients. Moreover, designing the network structure and finding the optimal hyperparameters for a given task requires expert-level knowledge. Consequently, training a deep neural network becomes very time consuming. To overcome these shortcomings, we present a model that combines Gabor filters with pseudoinverse learning autoencoders. The optimization method used in the model is a non-gradient-descent algorithm. In addition, we present empirical formulas for setting the number of hidden neurons and the number of hidden layers used throughout training. Experimental results show that our model is faster than existing benchmark methods while achieving comparable recognition accuracy.
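To make the pipeline concrete, below is a minimal sketch of the idea described in the abstract: Gabor filtering for feature extraction, followed by autoencoder layers whose decoder weights are obtained with a closed-form, pseudoinverse-style least-squares solve instead of gradient descent. This is an illustrative sketch, not the authors' implementation; the Gabor bank parameters, the random encoder initialization, and the regularization constant are assumptions made here for clarity (the cited pseudoinverse learning papers derive the encoder weights from a low-rank pseudoinverse of the input rather than from random projections).

import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(ksize=11, sigma=3.0, theta=0.0, lam=6.0, gamma=0.5, psi=0.0):
    # Real part of a standard 2D Gabor kernel (parameter values are illustrative).
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * xr / lam + psi)

def gabor_features(images, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    # Filter each image with a small Gabor bank and flatten the responses into one feature vector.
    kernels = [gabor_kernel(theta=t) for t in thetas]
    feats = [np.concatenate([convolve2d(img, k, mode='same').ravel() for k in kernels])
             for img in images]
    return np.asarray(feats)

def pilae_layer(X, n_hidden, reg=1e-3, rng=np.random.default_rng(0)):
    # One pseudoinverse-learning-style autoencoder layer.
    # The encoder is a random projection here (a simplification); the decoder is obtained in
    # closed form from a regularized least-squares problem, so no gradient descent is involved.
    W_enc = rng.standard_normal((X.shape[1], n_hidden)) / np.sqrt(X.shape[1])
    H = np.tanh(X @ W_enc)                                                # hidden representation
    W_dec = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)    # pseudoinverse-style solve
    recon_error = np.linalg.norm(H @ W_dec - X) / np.linalg.norm(X)
    return H, recon_error

# Usage on toy data: stack two such layers on Gabor features of random stand-in "images".
images = np.random.rand(32, 28, 28)
X = gabor_features(images)
H1, e1 = pilae_layer(X, n_hidden=128)
H2, e2 = pilae_layer(H1, n_hidden=64)
print(X.shape, H1.shape, H2.shape, round(e1, 3), round(e2, 3))

Because every layer's decoder is solved once in closed form, training cost is dominated by a handful of matrix factorizations rather than many epochs of backpropagation, which is the source of the speed advantage claimed in the abstract.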
Acknowledgements
The research work described in this paper was fully supported by grants from the National Natural Science Foundation of China (Project No. 61472043), the Joint Research Fund in Astronomy (U1531242) under the cooperative agreement between the NSFC and CAS, and the Natural Science Foundation of Shandong (ZR2015FL006). Prof. Ping Guo and Qian Yin are the authors to whom all correspondence should be addressed.
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Deng, X., Feng, S., Guo, P., Yin, Q. (2018). Fast Image Recognition with Gabor Filter and Pseudoinverse Learning AutoEncoders. In: Cheng, L., Leung, A., Ozawa, S. (eds) Neural Information Processing. ICONIP 2018. Lecture Notes in Computer Science, vol 11306. Springer, Cham. https://doi.org/10.1007/978-3-030-04224-0_43
DOI: https://doi.org/10.1007/978-3-030-04224-0_43
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-04223-3
Online ISBN: 978-3-030-04224-0
eBook Packages: Computer Science, Computer Science (R0)