Abstract
Successive quadratic approximation (SQA) methods are numerically efficient for minimizing the sum of a smooth function and a convex function. The iteration complexity of inexact SQA methods has been analyzed recently. In this paper, we present an algorithmic framework of inexact SQA methods with four types of line searches, and we analyze its global complexity under milder assumptions. First, we show that the framework is well defined and establish some decrease properties. Second, under the quadratic growth condition and a uniform positive lower bound on the stepsizes, we show that both the function value sequence and the iterate sequence converge linearly. Moreover, we obtain an \(o(1/k)\) complexity bound without the quadratic growth condition, improving existing \({\mathcal {O}}(1/k)\) complexity results. Finally, we show that a local gradient-Lipschitz-continuity condition guarantees a uniform positive lower bound for the stepsizes.
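To make the setting concrete, below is a minimal Python sketch of one member of the SQA family under the simplest Hessian model \(H_k = I/\alpha_k\), so that each quadratic subproblem is solved exactly by a single proximal step, combined with one Armijo-type backtracking line search on the composite objective. All names (inexact_sqa, prox_g, ...) and parameter choices are illustrative assumptions; the paper's framework is more general, allowing variable metrics, inexact subproblem solutions, and four line-search variants.

```python
import numpy as np

def inexact_sqa(f, grad_f, g, prox_g, x0, alpha0=1.0, beta=0.5,
                sigma=1e-4, max_iter=500, tol=1e-10):
    """Proximal-gradient instance of an SQA method with an Armijo-type
    backtracking line search (a hedged sketch, not the paper's exact
    framework; names and defaults are hypothetical)."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        gfx = grad_f(x)
        alpha = alpha0
        while True:
            # With H_k = I/alpha, the quadratic model subproblem
            #   min_y <grad_f(x), y - x> + ||y - x||^2/(2*alpha) + g(y)
            # is solved exactly by one proximal step.
            y = prox_g(x - alpha * gfx, alpha)
            d = y - x
            # Sufficient-decrease quantity Delta <= 0 for the prox step.
            delta = gfx @ d + g(y) - g(x)
            if f(y) + g(y) <= f(x) + g(x) + sigma * delta:
                break
            if np.linalg.norm(d) < tol:  # safeguard against stalling
                break
            alpha *= beta  # backtrack: shrink the stepsize
        if np.linalg.norm(d) < tol:
            return x
        x = y
    return x

# Usage on a small lasso problem: f(x) = 0.5*||Ax - b||^2, g = lam*||.||_1.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((20, 10)), rng.standard_normal(20), 0.1
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad_f = lambda x: A.T @ (A @ x - b)
g = lambda x: lam * np.sum(np.abs(x))
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - lam * t, 0.0)
x_star = inexact_sqa(f, grad_f, g, prox_g, np.zeros(10))
```

Since the gradient of f above is Lipschitz continuous on bounded sets, the backtracking loop terminates and the accepted stepsizes are bounded below by a positive constant, illustrating the role of the local gradient-Lipschitz-continuity condition in the abstract.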
Acknowledgements
We are grateful for the support of the National Natural Science Foundation of China (Nos. 11971480 and 11501569). We also thank the anonymous reviewers for their insightful comments.
Cite this article
Peng, W., Zhang, H., Zhang, X. et al. Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions. J Glob Optim 78, 69–89 (2020). https://doi.org/10.1007/s10898-020-00892-1