Abstract
Because of redundant features, a Radial-Basis Function Neural Network (RBFNN) trained from a dataset is likely to be large. Sensitivity analysis can help reduce the feature set by deleting insensitive features. Treating the perturbation of the network output as a random variable, this paper defines a new sensitivity measure: the limit of the variance of the output perturbation as the input perturbation tends to zero. To simplify the expression and computation of this sensitivity, we prove that the exchange of limit and variance is valid. A formula for computing the new sensitivity of individual features is derived. Numerical simulations show that the new sensitivity definition can be used to remove irrelevant features effectively.
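The idea in the abstract can be illustrated with a small numerical sketch. The paper's sensitivity is the limit of the variance of the output perturbation as the input perturbation goes to zero; a minimal Monte Carlo approximation of that quantity, for a toy Gaussian RBF network, is shown below. The network, the per-feature widths, and the finite-difference estimator are all illustrative assumptions, not the paper's derived closed-form formula.

```python
import numpy as np

def rbf_output(X, centers, widths, weights):
    """Toy Gaussian RBF network with per-feature widths:
    y(x) = sum_j w_j * exp(-0.5 * sum_k ((x_k - c_jk) / s_k)^2)."""
    d2 = (((X[:, None, :] - centers[None, :, :]) / widths) ** 2).sum(axis=2)
    return np.exp(-0.5 * d2) @ weights

def feature_sensitivity(X, centers, widths, weights, eps=1e-3,
                        n_draws=200, seed=0):
    """Monte Carlo stand-in for the paper's sensitivity: the variance of the
    output perturbation under small random perturbations of one input
    feature, normalised by the squared perturbation scale (eps**2)."""
    rng = np.random.default_rng(seed)
    base = rbf_output(X, centers, widths, weights)
    sens = np.empty(X.shape[1])
    for k in range(X.shape[1]):
        deltas = []
        for _ in range(n_draws):
            Xp = X.copy()
            Xp[:, k] += rng.uniform(-eps, eps, size=len(X))
            deltas.append(rbf_output(Xp, centers, widths, weights) - base)
        sens[k] = np.var(np.concatenate(deltas)) / eps ** 2
    return sens

# Toy setup: feature 2 is given a very large width, so the output is nearly
# flat along it -- a sensitivity-based ranking should flag it as removable.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
centers = rng.normal(size=(5, 3))
widths = np.array([1.0, 1.0, 100.0])
weights = rng.normal(size=5)
sens = feature_sensitivity(X, centers, widths, weights)
print(sens)  # last entry is orders of magnitude below the first two
```

Ranking features by such a score and dropping those with near-zero sensitivity is the feature-reduction step the abstract describes; the paper's contribution is a closed-form sensitivity obtained by exchanging the limit and the variance, which avoids this sampling.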
This research work is supported by NSFC (60473045) and Natural Science Foundation of Hebei Province (603137).
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Wang, X., Li, C. (2005). A New Definition of Sensitivity for RBFNN and Its Applications to Feature Reduction. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_12
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-25912-1
Online ISBN: 978-3-540-32065-4