Abstract
For convex minimization we introduce an algorithm based on 𝒱𝒰-space decomposition. The method uses a bundle subroutine to generate a sequence of approximate proximal points. When a primal-dual track leading to a solution and zero-subgradient pair exists, these points approximate the primal track points and give the algorithm's 𝒱, or corrector, steps. The subroutine also approximates dual track points that are 𝒰-gradients needed for the method's 𝒰-Newton predictor steps. With the inclusion of a simple line search, the resulting algorithm is proved to be globally convergent. The convergence is superlinear if the primal-dual track points and the objective's 𝒰-Hessian are approximated well enough.
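The predictor-corrector structure described above can be illustrated on a toy problem. The sketch below is not the paper's algorithm (which uses a bundle subroutine to compute approximate proximal points for a general convex function); it only shows the idea of a proximal 𝒱 (corrector) step followed by a 𝒰-Newton (predictor) step on the assumed example f(x, y) = |x| + ½y², where the x-axis plays the role of the nonsmooth 𝒱-space and the y-axis the smooth 𝒰-space. The exact soft-thresholding prox and the unit 𝒰-Hessian are simplifications specific to this example.

```python
import math

def f(z):
    # toy convex objective: nonsmooth in x (V-space), smooth in y (U-space)
    x, y = z
    return abs(x) + 0.5 * y * y

def prox_abs(x, mu):
    # exact proximal point of |.| (soft-thresholding); in the paper this
    # role is played by an approximate proximal point from a bundle subroutine
    return math.copysign(max(abs(x) - mu, 0.0), x)

def vu_iteration(z, mu=1.0):
    x, y = z
    x_new = prox_abs(x, mu)  # V (corrector) step toward the primal track
    # U-Newton (predictor) step: along U the function is 0.5*y**2,
    # so the U-gradient is y and the U-Hessian is 1
    y_new = y - y / 1.0
    return (x_new, y_new)

z = (2.0, 3.0)
for _ in range(5):
    z = vu_iteration(z)
# z is now at the minimizer (0, 0), reached in finitely many steps
# for this simple example
```

The separation is what drives the method's fast convergence: the proximal step handles the nonsmooth 𝒱 directions, while a Newton step exploits the second-order (𝒰-Hessian) information available along the smooth 𝒰 directions.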
Additional information
Dedicated to Terry Rockafellar who has had a great influence on our work via strong support for proximal points and for structural definitions that involve tangential convergence.
On leave from INRIA Rocquencourt
Research of the first author supported by the National Science Foundation under Grant No. DMS-0071459 and by CNPq (Brazil) under Grant No. 452966/2003-5. Research of the second author supported by FAPERJ (Brazil) under Grant No.E26/150.581/00 and by CNPq (Brazil) under Grant No. 383066/2004-2.
Cite this article
Mifflin, R., Sagastizábal, C. A 𝒱𝒰-algorithm for convex minimization. Math. Program. 104, 583–608 (2005). https://doi.org/10.1007/s10107-005-0630-3