Global Convergence Analysis of Decomposition Methods for Support Vector Regression

  • Conference paper
Advances in Neural Networks - ISNN 2008 (ISNN 2008)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 5263)


Abstract

Decomposition methods have been widely used to solve efficiently the large-scale quadratic programming (QP) problems arising in support vector regression (SVR). A decomposition method breaks the large QP problem into a series of smaller QP subproblems, each of which can be solved much faster than the original. In this paper, we analyze the global convergence of decomposition methods for SVR. We show that decomposition methods applied to the convex programming problem formulated by Flake and Lawrence always stop within a finite number of iterations.
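To make the decomposition idea concrete, the sketch below shows a generic two-variable (SMO-style) decomposition loop in Python for a QP of the form min_x (1/2) x^T Q x + p^T x subject to sum_i x_i = 0 and -C <= x_i <= C, which is the shape the SVR dual takes in Flake and Lawrence's single-variable formulation once the epsilon-insensitive l1 term is dropped. This is a minimal illustration under those assumptions, not the algorithm analyzed in the paper; the function name decomposition_qp, the maximal-violating-pair working-set rule, and the synthetic data are our choices for the example.

    import numpy as np

    def decomposition_qp(Q, p, C, tol=1e-6, max_iter=100000):
        """Toy SMO-style decomposition for
               min  0.5 * x^T Q x + p^T x
               s.t. sum(x) = 0,  -C <= x_i <= C.
        Hypothetical simplification: the eps*||x||_1 term of the
        Flake-Lawrence objective is omitted so that each two-variable
        subproblem is smooth and has a closed-form solution."""
        n = len(p)
        x = np.zeros(n)                       # feasible start: sum(x) = 0
        g = Q @ x + p                         # gradient of the objective
        for _ in range(max_iter):
            # Working-set selection: the maximal KKT-violating pair.
            up = np.where(x < C, -g, -np.inf)     # gain from raising x_i
            dn = np.where(x > -C, g, -np.inf)     # gain from lowering x_j
            i, j = int(np.argmax(up)), int(np.argmax(dn))
            if up[i] + dn[j] < tol:           # approximate KKT holds: stop
                return x, True
            # Two-variable subproblem: x_i += t, x_j -= t keeps sum(x) = 0.
            quad = Q[i, i] + Q[j, j] - 2.0 * Q[i, j]
            t = (g[j] - g[i]) / max(quad, 1e-12)  # unconstrained minimizer
            t = min(t, C - x[i], x[j] + C)        # clip to the box
            x[i] += t
            x[j] -= t
            g += t * (Q[:, i] - Q[:, j])      # rank-two gradient update
        return x, False                       # iteration cap reached

    # Tiny usage example with synthetic (hypothetical) data.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((6, 6))
    Q = A @ A.T                               # PSD stand-in for a kernel Gram matrix
    p = -rng.standard_normal(6)               # plays the role of -y
    x, converged = decomposition_qp(Q, p, C=1.0)
    print(converged, abs(x.sum()))            # True, ~0: equality constraint preserved

Each iteration solves a closed-form subproblem, so the practical question, and the subject of this paper's analysis, is whether such a loop always reaches the stopping condition after finitely many iterations rather than cycling.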


References

  1. Vapnik, V.N.: Statistical Learning Theory. Wiley, New York (1998)

  2. Platt, J.C.: Fast Training of Support Vector Machines Using Sequential Minimal Optimization. In: Schölkopf, B., Burges, C.J.C., Smola, A.J. (eds.) Advances in Kernel Methods: Support Vector Learning. MIT Press, Cambridge (1998)

  3. Joachims, T.: Making Large-scale SVM Learning Practical. In: Schölkopf, B., Burges, C.J.C., Smola, A.J. (eds.) Advances in Kernel Methods: Support Vector Learning. MIT Press, Cambridge (1998)

  4. Keerthi, S.S., Shevade, S.K., Bhattacharyya, C., Murthy, K.R.K.: Improvements to Platt’s SMO Algorithm for SVM Classifier Design. Neural Computation 13, 637–649 (2001)

  5. Hsu, C.W., Lin, C.J.: A Simple Decomposition Method for Support Vector Machines. Machine Learning 46, 291–314 (2002)

  6. Takahashi, N., Nishi, T.: Global Convergence of Decomposition Learning Methods for Support Vector Machines. IEEE Trans. on Neural Networks 17, 1362–1368 (2006)

  7. Shevade, S.K., Keerthi, S.S., Bhattacharyya, C., Murthy, K.R.K.: Improvements to the SMO Algorithm for SVM Regression. IEEE Trans. on Neural Networks 11, 1183–1188 (2000)

  8. Laskov, P.: An Improved Decomposition Algorithm for Regression Support Vector Machines. In: Workshop on Support Vector Machines, NIPS 1999 (1999)

  9. Liao, S.P., Lin, H.T., Lin, C.J.: A Note on the Decomposition Methods for Support Vector Regression. Neural Computation 14, 1267–1281 (2002)

  10. Flake, G.W., Lawrence, S.: Efficient SVM Regression Training with SMO. Machine Learning 46, 271–290 (2002)

  11. Zangwill, W.I.: Nonlinear Programming: A Unified Approach. Prentice-Hall, Englewood Cliffs (1969)

  12. Luenberger, D.G.: Linear and Nonlinear Programming. Addison-Wesley, Reading (1989)

  13. Guo, J., Takahashi, N., Nishi, T.: Convergence Proof of a Sequential Minimal Optimization Algorithm for Support Vector Regression. In: Proc. of IJCNN 2006, pp. 747–754 (2006)

Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Guo, J., Takahashi, N. (2008). Global Convergence Analysis of Decomposition Methods for Support Vector Regression. In: Sun, F., Zhang, J., Tan, Y., Cao, J., Yu, W. (eds) Advances in Neural Networks - ISNN 2008. ISNN 2008. Lecture Notes in Computer Science, vol 5263. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87732-5_74

  • DOI: https://doi.org/10.1007/978-3-540-87732-5_74

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-87731-8

  • Online ISBN: 978-3-540-87732-5
