Abstract
In this work we define a block decomposition Jacobi-type method for nonlinear optimization problems with one linear constraint and bound constraints on the variables. We prove convergence of the method to stationary points of the problem under quite general assumptions.
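To make the setting concrete, the problem class treated in the paper is min f(x) subject to a single linear constraint a^T x = b and bounds l ≤ x ≤ u. The sketch below is a minimal illustration of the general Jacobi (parallel block) idea only, not the algorithm analyzed in the paper: it assumes a quadratic objective f(x) = ½ xᵀQx − pᵀx with nonzero coefficients a_i, uses fixed two-variable blocks all updated from the same current point, and recombines the block solutions by a convex combination; the helper names (step_interval, pair_update, jacobi_step) and the descent safeguard are hypothetical choices made for this example.

```python
import numpy as np

def step_interval(x, d, l, u):
    """Feasible step range [t_lo, t_hi] so that l <= x + t*d <= u."""
    t_lo, t_hi = -np.inf, np.inf
    for c in np.nonzero(d)[0]:
        lo, hi = (l[c] - x[c]) / d[c], (u[c] - x[c]) / d[c]
        if d[c] < 0:
            lo, hi = hi, lo
        t_lo, t_hi = max(t_lo, lo), min(t_hi, hi)
    return t_lo, t_hi

def pair_update(x, Q, p, a, l, u, i, j):
    """Minimize the quadratic over the pair (i, j) along the direction
    d with d_i = 1/a_i, d_j = -1/a_j, which leaves a^T x unchanged."""
    d = np.zeros_like(x)
    d[i], d[j] = 1.0 / a[i], -1.0 / a[j]
    g = Q @ x - p                        # gradient of 0.5 x'Qx - p'x
    curv = d @ (Q @ d)
    if curv <= 1e-12:                    # skip flat/non-convex pairs in this sketch
        return x.copy()
    t = np.clip(-(g @ d) / curv, *step_interval(x, d, l, u))
    return x + t * d

def jacobi_step(x, Q, p, a, l, u):
    """One Jacobi-type sweep: all pair subproblems are solved from the same
    point x; the candidates are recombined by averaging, which preserves
    both a^T x = b and the bounds because the feasible set is convex."""
    f = lambda z: 0.5 * z @ Q @ z - p @ z
    pairs = [(i, i + 1) for i in range(0, len(x) - 1, 2)]
    cand = [pair_update(x, Q, p, a, l, u, i, j) for i, j in pairs]
    y = np.mean(cand, axis=0)
    return y if f(y) <= f(x) else min(cand, key=f)   # simple descent safeguard

# Illustrative run on a small random convex instance.
rng = np.random.default_rng(0)
n = 6
M = rng.standard_normal((n, n))
Q, p = M @ M.T + np.eye(n), rng.standard_normal(n)
a, l, u = np.ones(n), np.zeros(n), np.ones(n)
x = np.full(n, 0.5)                      # feasible start: a^T x = 3, bounds hold
for _ in range(200):
    x = jacobi_step(x, Q, p, a, l, u)
```

The block selection rule, the way each block subproblem is solved, and the rule for combining the parallel updates are precisely the ingredients whose convergence conditions the paper establishes; the sketch only shows why parallel block updates that each preserve aᵀx = b and the bounds can be recombined without losing feasibility.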
Cite this article
Liuzzi, G., Palagi, L. & Piacentini, M. On the convergence of a Jacobi-type algorithm for singly linearly-constrained problems subject to simple bounds. Optim Lett 5, 347–362 (2011). https://doi.org/10.1007/s11590-010-0214-x