
Evolving Artificial Neural Networks Using Adaptive Differential Evolution

  • Conference paper
Advances in Artificial Intelligence – IBERAMIA 2010 (IBERAMIA 2010)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 6433)

Abstract

One of the main problems in training artificial neural networks is defining their initial weights and architecture. Evolutionary algorithms (EAs) have been widely used to optimize artificial neural networks because they can handle large, non-differentiable, complex, and multimodal search spaces, and because they are good at locating the optimal region. In this paper we propose the use of Adaptive Differential Evolution (JADE), a recent evolutionary algorithm based on differential evolution (DE), to address the problem of training neural networks. Experiments were performed to evaluate the proposed method on machine learning benchmarks for classification problems.
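The JADE algorithm mentioned in the abstract adapts the mutation factor F and crossover rate CR during the search and uses a "DE/current-to-pbest/1" mutation strategy. The sketch below illustrates this idea applied to weight training; the tiny 2-2-1 XOR network, population size, and control parameters are illustrative assumptions rather than the paper's actual benchmark setup, and the optional external archive of the original JADE algorithm is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: evolve the 9 weights of a 2-2-1 feedforward net on XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def mse(w):
    """Training MSE of the network encoded by weight vector w (length 9)."""
    W1, b1 = w[:4].reshape(2, 2), w[4:6]
    W2, b2 = w[6:8], w[8]
    h = np.tanh(X @ W1 + b1)                    # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return np.mean((out - y) ** 2)

def jade(fitness, dim, pop_size=30, gens=300, p=0.1, c=0.1):
    pop = rng.uniform(-1.0, 1.0, (pop_size, dim))
    fit = np.array([fitness(x) for x in pop])
    mu_F, mu_CR = 0.5, 0.5                      # adapted means of F and CR
    for _ in range(gens):
        S_F, S_CR = [], []                      # successful parameters
        pbest = np.argsort(fit)[:max(1, int(p * pop_size))]
        for i in range(pop_size):
            # Sample per-individual F (Cauchy) and CR (Gaussian), as in JADE.
            F = np.clip(0.1 * rng.standard_cauchy() + mu_F, 0.01, 1.0)
            CR = np.clip(rng.normal(mu_CR, 0.1), 0.0, 1.0)
            # DE/current-to-pbest/1 mutation.
            xb = pop[rng.choice(pbest)]
            others = np.delete(np.arange(pop_size), i)
            r1, r2 = rng.choice(others, 2, replace=False)
            v = pop[i] + F * (xb - pop[i]) + F * (pop[r1] - pop[r2])
            # Binomial crossover with at least one inherited gene.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            u = np.where(mask, v, pop[i])
            fu = fitness(u)
            if fu < fit[i]:                     # greedy selection
                pop[i], fit[i] = u, fu
                S_F.append(F)
                S_CR.append(CR)
        if S_F:                                 # adapt means from successes
            mu_F = (1 - c) * mu_F + c * (np.sum(np.square(S_F)) / np.sum(S_F))
            mu_CR = (1 - c) * mu_CR + c * np.mean(S_CR)
    best = int(np.argmin(fit))
    return pop[best], fit[best]

w, err = jade(mse, dim=9)
print(f"best training MSE: {err:.4f}")
```

The Lehmer mean used to update mu_F deliberately biases the adapted mutation factor upward, which helps maintain exploration; mu_CR uses a plain arithmetic mean, following the original JADE formulation.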




Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

da Silva, A.J., Mineu, N.L., Ludermir, T.B. (2010). Evolving Artificial Neural Networks Using Adaptive Differential Evolution. In: Kuri-Morales, A., Simari, G.R. (eds) Advances in Artificial Intelligence – IBERAMIA 2010. IBERAMIA 2010. Lecture Notes in Computer Science, vol. 6433. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-16952-6_40

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-16951-9

  • Online ISBN: 978-3-642-16952-6
