A New Multilayer Perceptron Initialisation Method with Selection of Weights on the Basis of the Function Variability

  • Conference paper
Artificial Intelligence and Soft Computing (ICAISC 2014)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 8467)

Included in the following conference series: ICAISC: International Conference on Artificial Intelligence and Soft Computing

Abstract

The learning results of multilayer perceptrons depend strongly on the initial weight values. Proper weight selection may improve network performance and shorten the learning process. In this paper, a new weight initialisation algorithm for multilayer perceptrons is proposed, in which the weights are selected on the basis of the variability of the approximated function within various fragments of its domain. The algorithm has low computational complexity. Results of numerical experiments on many learning sets are presented, comparing the cost function values of networks initialised with the proposed algorithm and of networks initialised with the popular Nguyen-Widrow algorithm. Independently of the number of epochs, the proposed algorithm achieved better results for the vast majority of the learning sets.
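For context, the sketch below shows the classical Nguyen-Widrow initialisation that serves as the comparison baseline in the experiments (see reference 9). It is a minimal NumPy sketch written under the usual assumptions of that method (inputs normalised to [-1, 1], sigmoid-like hidden activations); the function name and the example network size are illustrative choices, and the sketch is not the initialisation algorithm proposed in this paper.

    import numpy as np

    def nguyen_widrow_init(n_inputs, n_hidden, rng=None):
        # Nguyen-Widrow initialisation for the hidden layer of a two-layer
        # perceptron. Assumes inputs are normalised to [-1, 1].
        rng = np.random.default_rng() if rng is None else rng

        # Start from small random weights.
        W = rng.uniform(-0.5, 0.5, size=(n_hidden, n_inputs))

        # Scaling factor beta = 0.7 * H^(1/N), with H hidden neurons and N inputs.
        beta = 0.7 * n_hidden ** (1.0 / n_inputs)

        # Rescale each hidden neuron's weight vector to norm beta so the active
        # regions of the sigmoids are spread over the input domain.
        W = beta * W / np.linalg.norm(W, axis=1, keepdims=True)

        # Biases are drawn uniformly from [-beta, beta].
        b = rng.uniform(-beta, beta, size=n_hidden)
        return W, b

    # Example: initialise a network with 2 inputs and 10 hidden neurons.
    W, b = nguyen_widrow_init(n_inputs=2, n_hidden=10)

The proposed method differs from this baseline in that, when selecting the weights, it additionally takes into account the variability of the approximated function within different fragments of its domain.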

References

  1. Nelles, O.: Summary of Chapter 11. In: Nelles, O.: Nonlinear System Identification: From Classical Approaches to Neural Networks and Fuzzy Models, pp. 296–297. Springer, Berlin (2001)

  2. Ebert, T., Bänfer, O., Nelles, O.: Multilayer Perceptron Network with Modified Sigmoid Activation Functions. In: Wang, F.L., Deng, H., Gao, Y., Lei, J. (eds.) AICI 2010. LNCS, vol. 6319, pp. 414–421. Springer, Heidelberg (2010)

  3. Maniezzo, V.: Genetic evolution of the topology and weight distribution of the neural networks. Neural Networks 5(1), 39–53 (1994)

  4. Guijarro-Berdiñas, B., Fontenla-Romero, O., Pérez-Sánchez, B., Alonso-Betanzos, A.: A new initialization method for neural networks using sensitivity analysis. In: Proceedings of the International Conference on Mathematical and Statistical Modeling in Honor of Enrique Castillo, San Diego (2006)

  5. Wei, X., Xiu-Tao, Y.: The application of optimal weights initialization algorithm based on information amount in multi-layer perceptron networks. In: Proceedings of the 3rd IEEE International Conference on Computer Science and Information Technology (ICCSIT), vol. 6, pp. 196–198 (2010)

  6. Cho, S., Chow, T.W.S.: Training multilayer neural networks using fast global learning algorithm - least-squares and penalized optimization methods. Neurocomputing 25, 115–131 (1999)

  7. Jin-Song, P., Mai, E.C., Wright, J.P., Smyth, A.W.: Neural network initialization with prototypes - function approximation in engineering mechanics applications. In: Proceedings of the International Joint Conference on Neural Networks, Orlando, pp. 2110–2116 (2007)

  8. Erdogmus, D., Fontenla-Romero, O., Principe, J.C., Alonso-Betanzos, A., Castillo, E.: Linear-least-squares initialization of multilayer perceptrons through backpropagation of the desired response. IEEE Transactions on Neural Networks 16(2), 325–327 (2005)

  9. Nguyen, D., Widrow, B.: Improving the learning speed of 2-Layer neural networks by choosing initial values of the adaptive weights. In: Proceedings of the International Joint Conference on Neural Networks, vol. 3, pp. 21–26 (1990)

  10. Pavelka, A., Procházka, A.: Algorithms for Initialization of Neural Network Weights. In: Sborník příspěvků 11. Konference MATLAB (Proceedings of the 11th MATLAB Conference), vol. 2, pp. 453–459 (2004)

  11. van de Laar, P., Heskes, T., Gielen, S.: Partial retraining: a new approach to input relevance determination. International Journal of Neural Systems 9(1), 75–85 (1999)

  12. Online documentation of GTOPO30 data (2012), http://eros.usgs.gov/#/Find_Data/Products_and_Data_Available/gtopo30/README

  13. UCI Machine Learning Repository, http://archive.ics.uci.edu/ml

  14. Cortez, P., Morais, A.: A data mining approach to predict forest fires using meteorological data. In: Neves, J., Santos, M.F., Machado, J. (eds.) New Trends in Artificial Intelligence, Proceedings of the 13th EPIA - Portuguese Conference on Artificial Intelligence, pp. 512–523. APPIA, Guimarães (2007)

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Halawa, K. (2014). A New Multilayer Perceptron Initialisation Method with Selection of Weights on the Basis of the Function Variability. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing. ICAISC 2014. Lecture Notes in Computer Science, vol. 8467. Springer, Cham. https://doi.org/10.1007/978-3-319-07173-2_5

  • DOI: https://doi.org/10.1007/978-3-319-07173-2_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-07172-5

  • Online ISBN: 978-3-319-07173-2

  • eBook Packages: Computer Science, Computer Science (R0)
