Abstract
This paper presents an algorithm for approximately minimizing a convex function over simple, finite-dimensional convex domains that need not be bounded, assuming only that function values and subgradients are available. No global information about the objective function is needed apart from a strong convexity parameter (which may be set to zero if only convexity is known). The worst-case number of iterations needed to reach a given accuracy is independent of the dimension and, apart from a constant factor, best possible under a variety of smoothness assumptions on the objective function.
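To make the oracle model of the abstract concrete, the sketch below implements a plain projected subgradient method in Python. This is not the OSGA algorithm of the paper, only a minimal illustration of the assumptions it works under: at each iterate the solver may query a function value and one subgradient, and the domain is "simple" in the sense that projecting onto it is cheap. The step-size rule, the example objective, and the helper names (`subgrad`, `project`) are all illustrative choices, not taken from the paper.

```python
import numpy as np

def subgradient_method(f, subgrad, project, x0, steps=500):
    """Plain projected subgradient method for min f(x) over a simple domain.

    Not the OSGA algorithm itself -- only a sketch of the oracle model the
    paper assumes: queries return a function value f(x) and one subgradient
    g from the subdifferential of f at x. `project` maps a point onto the
    (simple) feasible domain.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, steps + 1):
        g = subgrad(x)
        # Classical diminishing step size 1/sqrt(k); OSGA replaces this with
        # a parameter-free update achieving the optimal complexity bound.
        x = project(x - g / np.sqrt(k))
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Toy example: minimize the nonsmooth f(x) = ||x - c||_1 over the box [0,1]^3.
c = np.array([0.2, 0.9, -0.5])
f = lambda x: np.abs(x - c).sum()
subgrad = lambda x: np.sign(x - c)          # a valid subgradient of the 1-norm
project = lambda x: np.clip(x, 0.0, 1.0)    # projection onto the unit box
x_best, f_best = subgradient_method(f, subgrad, project, np.zeros(3))
```

Here the minimizer over the box is the clipped center `[0.2, 0.9, 0.0]` with optimal value 0.5, and the diminishing-step iteration approaches it slowly; the point of OSGA is to reach a given accuracy in the provably smallest number of such oracle calls, up to a constant factor.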
Acknowledgments
I’d like to thank Masoud Ahookhosh for numerous useful remarks on earlier versions of the manuscript. Thanks also to the referees for a number of suggestions that improved the paper.
Cite this article
Neumaier, A. OSGA: a fast subgradient algorithm with optimal complexity. Math. Program. 158, 1–21 (2016). https://doi.org/10.1007/s10107-015-0911-4
Keywords
- Complexity bound
- Convex optimization
- Optimal subgradient method
- Large-scale optimization
- Nesterov’s optimal method
- Nonsmooth optimization
- Optimal first-order method
- Smooth optimization
- Strongly convex