Abstract
This paper describes PSONN, a new evolutionary system for evolving artificial neural networks (ANNs) based on the particle swarm optimisation (PSO) algorithm. PSO is used to evolve both the architectures and the weights of ANNs: an ANN's architecture is adaptively adjusted by the PSO algorithm, and the weights of the nodes in that architecture are then also evolved by PSO in order to evaluate the quality of the architecture. This process is repeated until the best ANN is accepted or the maximum number of generations is reached. In PSONN, a strategy of evolving added nodes and a partial training algorithm maintain a close behavioural link between parents and their offspring, which improves the efficiency of evolving ANNs. PSONN has been tested on two real problems in the medical domain; the results show that the ANNs evolved by PSONN have good accuracy and generalisation ability.
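The abstract builds on the standard PSO update, in which each particle's velocity is pulled toward its own best-known position and the swarm's global best. As a rough illustration of that underlying mechanism (not of PSONN itself), the sketch below applies PSO to a generic fitness function; in PSONN the fitness would be a network's training error, but here a simple sphere function stands in for it. The function name `pso` and the parameter values (`w`, `c1`, `c2`, swarm size) are illustrative assumptions, not settings taken from the paper.

```python
import random

def pso(fitness, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise `fitness` over R^dim with a basic particle swarm.

    Illustrative sketch only: in PSONN the fitness would be the training
    error of a candidate network, and positions would encode its weights.
    """
    rng = random.Random(seed)
    # Initialise particle positions randomly and velocities at zero.
    pos = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    # Personal bests start at the initial positions.
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    # Global best is the best personal best.
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive pull + social pull.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Example: minimise a stand-in fitness (the sphere function sum(x_d^2)).
best, best_f = pso(lambda x: sum(v * v for v in x), dim=5)
```

With the training error of a candidate network substituted for the sphere function, the same loop would tune that network's weights, which is the weight-evolution role PSO plays inside PSONN.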
Copyright information
© 2000 Springer-Verlag London
Cite this paper
Zhang, C., Shao, H. (2000). Particle Swarm Optimisation in Feedforward Neural Network. In: Malmgren, H., Borga, M., Niklasson, L. (eds) Artificial Neural Networks in Medicine and Biology. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-0513-8_50
Publisher Name: Springer, London
Print ISBN: 978-1-85233-289-1
Online ISBN: 978-1-4471-0513-8