Abstract
Decomposition methods have been widely used to efficiently solve the large-scale quadratic programming (QP) problems arising in support vector regression (SVR). In a decomposition method, a large QP problem is decomposed into a series of smaller QP subproblems, each of which can be solved much faster than the original problem. In this paper, we analyze the global convergence of decomposition methods for SVR. We show that decomposition methods for the convex programming problem formulated by Flake and Lawrence always stop within a finite number of iterations.
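To make the decomposition idea concrete, below is a minimal sketch of an SMO-style decomposition loop, i.e., one that repeatedly solves a two-variable subproblem analytically. It targets a simplified SVR dual, min 0.5 β^T K β − y^T β subject to Σβ_i = 0 and −C ≤ β_i ≤ C (the ε-insensitive term of the full SVR dual is omitted for brevity). This is not the algorithm analyzed in the paper; the function name `svr_decomposition` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def svr_decomposition(K, y, C, tol=1e-3, max_iter=10_000):
    """Minimize 0.5*b @ K @ b - y @ b  s.t.  sum(b) == 0,  -C <= b_i <= C.

    A size-2 working set is the smallest decomposition subproblem that
    can preserve the equality constraint; it is solved in closed form.
    """
    n = len(y)
    beta = np.zeros(n)           # feasible start: sum(beta) = 0
    grad = -y.astype(float)      # gradient K @ beta - y at beta = 0
    for _ in range(max_iter):
        # Maximal-violating-pair selection: i may move up, j may move down.
        up = beta < C - 1e-12
        down = beta > -C + 1e-12
        i = np.where(up)[0][np.argmin(grad[up])]
        j = np.where(down)[0][np.argmax(grad[down])]
        if grad[j] - grad[i] < tol:          # KKT satisfied up to tol
            break
        # Analytic step along the feasible direction e_i - e_j.
        eta = K[i, i] + K[j, j] - 2.0 * K[i, j]
        t = (grad[j] - grad[i]) / max(eta, 1e-12)
        t = min(t, C - beta[i], beta[j] + C)  # clip to the box
        beta[i] += t
        beta[j] -= t
        grad += t * (K[:, i] - K[:, j])       # rank-2 gradient update
    return beta

# Illustrative usage on random data (K must be positive semidefinite):
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
beta = svr_decomposition(X @ X.T, y=rng.standard_normal(20), C=1.0)
```

Each iteration touches only two columns of K, which is what makes decomposition attractive when the full kernel matrix is too large to factorize; the finite-termination question studied in the paper asks when such a loop is guaranteed to reach the stopping condition.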
References
Vapnik, V.N.: Statistical Learning Theory. Wiley, New York (1998)
Platt, J.C.: Fast Training of Support Vector Machines Using Sequential Minimal Optimization. In: Schölkopf, B., Burges, C.J.C., Smola, A.J. (eds.) Advances in Kernel Methods: Support Vector Learning. MIT Press, Cambridge (1998)
Joachims, T.: Making Large-Scale SVM Learning Practical. In: Schölkopf, B., Burges, C.J.C., Smola, A.J. (eds.) Advances in Kernel Methods: Support Vector Learning. MIT Press, Cambridge (1998)
Keerthi, S.S., Shevade, S.K., Bhattacharyya, C., Murthy, K.R.K.: Improvements to Platt’s SMO Algorithm for SVM Classifier Design. Neural Computation 13, 637–649 (2001)
Hsu, C.W., Lin, C.J.: A Simple Decomposition Method for Support Vector Machines. Machine Learning 46, 291–314 (2002)
Takahashi, N., Nishi, T.: Global Convergence of Decomposition Learning Methods for Support Vector Machines. IEEE Trans. on Neural Networks 17, 1362–1368 (2006)
Shevade, S.K., Keerthi, S.S., Bhattacharyya, C., Murthy, K.R.K.: Improvements to the SMO Algorithm for SVM Regression. IEEE Trans. on Neural Networks 11, 1188–1193 (2000)
Laskov, P.: An Improved Decomposition Algorithm for Regression Support Vector Machines. In: Workshop on Support Vector Machines, NIPS 1999 (1999)
Liao, S.P., Lin, H.T., Lin, C.J.: A Note on the Decomposition Methods for Support Vector Regression. Neural Computation 14, 1267–1281 (2002)
Flake, G.W., Lawrence, S.: Efficient SVM Regression Training with SMO. Machine Learning 46, 271–290 (2002)
Zangwill, W.I.: Nonlinear Programming: A Unified Approach. Prentice-Hall, Englewood Cliffs (1969)
Luenberger, D.G.: Linear and Nonlinear Programming. Addison-Wesley, Reading (1989)
Guo, J., Takahashi, N., Nishi, T.: Convergence Proof of a Sequential Minimal Optimization Algorithm for Support Vector Regression. In: Proc. of IJCNN 2006, pp. 747–754 (2006)