Abstract
One of the main problems in training artificial neural networks is defining their initial weights and architecture. Evolutionary algorithms (EAs) are widely used to optimize artificial neural networks because they can handle large, non-differentiable, complex and multimodal search spaces, and because they are effective at locating the optimal region. In this paper we propose the use of Adaptive Differential Evolution (JADE), a recent evolutionary algorithm based on differential evolution (DE), to train neural networks. Experiments on machine learning classification benchmarks were performed to evaluate the proposed method.
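To illustrate the kind of training the abstract describes, the sketch below evolves the flattened weight vector of a tiny 2-2-1 feedforward network on the XOR task using JADE's "current-to-p-best/1" mutation with adaptive F and CR. This is a minimal illustration, not the authors' implementation: the network size, population size, generation count and constants (p = 0.2, c = 0.1) are illustrative assumptions.

```python
import math, random

random.seed(0)

# XOR classification data
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [0, 1, 1, 0]

D = 9  # 2-2-1 network: 6 hidden weights/biases + 3 output weights/bias

def forward(w, x):
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

def mse(w):
    # Fitness: mean squared error over the training set
    return sum((forward(w, x) - y) ** 2 for x, y in zip(X, Y)) / len(X)

NP, p, c = 30, 0.2, 0.1          # population size, p-best fraction, adaptation rate
mu_F, mu_CR = 0.5, 0.5           # adaptive means for F and CR
pop = [[random.uniform(-1, 1) for _ in range(D)] for _ in range(NP)]
fit = [mse(w) for w in pop]
initial_best = min(fit)

for gen in range(200):
    S_F, S_CR = [], []           # parameters of successful trials
    order = sorted(range(NP), key=lambda i: fit[i])
    pbest_pool = order[:max(1, int(p * NP))]
    for i in range(NP):
        # F_i ~ Cauchy(mu_F, 0.1) truncated to (0, 1]; CR_i ~ N(mu_CR, 0.1) clipped
        F = -1.0
        while F <= 0:
            F = mu_F + 0.1 * math.tan(math.pi * (random.random() - 0.5))
        F = min(F, 1.0)
        CR = min(max(random.gauss(mu_CR, 0.1), 0.0), 1.0)
        pb = pop[random.choice(pbest_pool)]
        r1, r2 = random.sample([j for j in range(NP) if j != i], 2)
        jrand = random.randrange(D)
        # current-to-p-best/1 mutation with binomial crossover
        trial = [pop[i][j] + F * (pb[j] - pop[i][j]) + F * (pop[r1][j] - pop[r2][j])
                 if j == jrand or random.random() < CR else pop[i][j]
                 for j in range(D)]
        f_trial = mse(trial)
        if f_trial < fit[i]:     # greedy selection keeps the better vector
            pop[i], fit[i] = trial, f_trial
            S_F.append(F); S_CR.append(CR)
    if S_F:
        # Adapt mu_F with the Lehmer mean, mu_CR with the arithmetic mean
        mu_F = (1 - c) * mu_F + c * (sum(f * f for f in S_F) / sum(S_F))
        mu_CR = (1 - c) * mu_CR + c * (sum(S_CR) / len(S_CR))

final_best = min(fit)
print(f"best MSE: {initial_best:.4f} -> {final_best:.4f}")
```

Because selection only ever replaces a vector with a strictly better trial, the best fitness in the population is non-increasing across generations; extending the search to weights *and* architecture, as the paper proposes, would additionally require encoding structural choices in the individual.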
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
da Silva, A.J., Mineu, N.L., Ludermir, T.B. (2010). Evolving Artificial Neural Networks Using Adaptive Differential Evolution. In: Kuri-Morales, A., Simari, G.R. (eds) Advances in Artificial Intelligence – IBERAMIA 2010. IBERAMIA 2010. Lecture Notes in Computer Science(), vol 6433. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-16952-6_40
Print ISBN: 978-3-642-16951-9
Online ISBN: 978-3-642-16952-6