Abstract
To handle large-scale pattern classification problems, various sequential and parallel classification methods have been developed according to the divide-and-conquer principle. However, existing sequential methods require long training times, and some parallel methods reduce generalization accuracy and increase the number of support vectors. In this paper, we propose a novel hierarchical and parallel method for training support vector machines. The simulation results indicate that our method can not only speed up training but also reduce the number of support vectors while maintaining generalization accuracy.
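The divide-and-conquer scheme outlined in the abstract can be illustrated with a cascade-style sketch: partition the training set, train an SVM on each partition (these leaf trainings can run in parallel), then merge the resulting support vectors pairwise up a binary tree and retrain at each merge. This is a minimal illustration of the general idea only; the partitioning and combination rules here are assumptions, not the authors' exact method, and `hierarchical_svm` and `train_subset` are hypothetical helper names.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def train_subset(X, y):
    """Train an SVM on one partition; keep only its support vectors."""
    clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
    return X[clf.support_], y[clf.support_]


def hierarchical_svm(X, y, n_parts=4, seed=0):
    """Binary-tree cascade: leaf SVMs on partitions, then merge upward."""
    rng = np.random.default_rng(seed)
    parts = np.array_split(rng.permutation(len(X)), n_parts)
    # Leaf level: independent trainings, parallelizable in principle.
    level = [train_subset(X[i], y[i]) for i in parts]
    # Merge pairs of support-vector sets and retrain until one set remains.
    while len(level) > 1:
        nxt = []
        for a, b in zip(level[::2], level[1::2]):
            Xm = np.vstack([a[0], b[0]])
            ym = np.concatenate([a[1], b[1]])
            nxt.append(train_subset(Xm, ym))
        if len(level) % 2:  # odd partition left over: carry it up unchanged
            nxt.append(level[-1])
        level = nxt
    Xf, yf = level[0]
    return SVC(kernel="rbf", gamma="scale").fit(Xf, yf)


X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
model = hierarchical_svm(Xtr, ytr)
print(round(model.score(Xte, yte), 2))
```

Because each merge retrains only on support vectors rather than the full data, the final model typically uses far fewer training points than a single SVM trained on everything, which is the source of both the speedup and the reduction in support vectors claimed above.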
This work was supported by the National Natural Science Foundation of China under the grants NSFC 60375022 and NSFC 60473040. This work was also supported in part by Open Fund of Grid Computing Center, Shanghai Jiao Tong University.
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Wen, Y., Lu, B. (2005). A Hierarchical and Parallel Method for Training Support Vector Machines. In: Wang, J., Liao, X., Yi, Z. (eds) Advances in Neural Networks – ISNN 2005. ISNN 2005. Lecture Notes in Computer Science, vol 3496. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11427391_141
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-25912-1
Online ISBN: 978-3-540-32065-4
eBook Packages: Computer Science, Computer Science (R0)