Abstract
Feature selection removes irrelevant features and improves the performance of learning systems, making it a crucial step in machine learning. Feature selection methods based on support vector machines have obtained satisfactory results, but previous work mostly targets binary classification and requires auxiliary techniques to be extended to multi-class problems. In this paper, we propose a prediction-risk based feature selection method using multi-class support vector machines. The performance of the proposed method is compared with previous optimal-brain-damage based feature selection methods that use binary support vector machines. Experiments on UCI data sets show that the prediction-risk based method obtains better results than the previous SVM-based methods on multi-class classification problems.
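To make the criterion concrete, the following is a minimal sketch of prediction-risk based feature ranking with a multi-class SVM. It is not the authors' implementation: it assumes scikit-learn's SVC as the multi-class SVM, the Iris data as a stand-in UCI set, and estimates the prediction risk of a feature as the increase in validation error when that feature is fixed to its training mean.

```python
# Minimal sketch (assumed setup, not the paper's code): rank features by prediction risk
# with a multi-class SVM. Risk of feature i = validation error after fixing feature i to
# its training mean, minus the baseline validation error.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", gamma="scale", C=1.0)  # multi-class handled internally (one-vs-one)
clf.fit(X_train, y_train)
base_error = 1.0 - clf.score(X_val, y_val)     # baseline validation error

risks = []
for i in range(X_val.shape[1]):
    X_perturbed = X_val.copy()
    X_perturbed[:, i] = X_train[:, i].mean()   # "remove" feature i by fixing it to its mean
    error_i = 1.0 - clf.score(X_perturbed, y_val)
    risks.append(error_i - base_error)

# Features with the smallest risk contribute least and are candidates for removal.
ranking = np.argsort(risks)
print("feature ranking (least to most important):", ranking)
print("prediction risks:", np.round(risks, 4))
```

In a backward-elimination loop one would drop the lowest-risk feature, retrain the SVM, and repeat; the sketch above shows only a single ranking pass.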
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
Cite this paper
Li, G.Z., Yang, J., Liu, G.P., Xue, L. (2004). Feature Selection for Multi-class Problems Using Support Vector Machines. In: Zhang, C., Guesgen, H.W., Yeap, W.K. (eds) PRICAI 2004: Trends in Artificial Intelligence. Lecture Notes in Computer Science, vol 3157. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28633-2_32
DOI: https://doi.org/10.1007/978-3-540-28633-2_32
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22817-2
Online ISBN: 978-3-540-28633-2