Abstract
This paper presents a new adaptive steplength for the gradient method that exploits the advantages of the two steplengths proposed by Barzilai and Borwein. In particular, the proposed steplength is based on an optimal step size determined by minimizing a merit function constructed as a convex combination of the cost function and the norm of its gradient. Global convergence and some theoretical properties of the proposed gradient method are established. Finally, computational studies are included to highlight the efficiency and effectiveness of the new approach.
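To make the ingredients named in the abstract concrete, the following sketch shows, for a strictly convex quadratic f(x) = ½xᵀAx − bᵀx, the two classical Barzilai–Borwein steplengths and a step obtained by minimizing a convex combination of the cost function and (half) the squared gradient norm along the steepest-descent direction. The weight `tau` and the closed-form `merit_step` formula are an illustrative reconstruction from the abstract's description, not the paper's exact steplength.

```python
import numpy as np

def bb_steps(s, y):
    """Barzilai-Borwein steplengths from the differences
    s = x_k - x_{k-1} and y = g_k - g_{k-1}:
    BB1 = s's / s'y,  BB2 = s'y / y'y."""
    return (s @ s) / (s @ y), (s @ y) / (y @ y)

def merit_step(A, g, tau):
    """Minimizer over alpha of
        tau * f(x - alpha*g) + (1 - tau) * 0.5*||grad f(x - alpha*g)||^2
    for the quadratic f(x) = 0.5 x'Ax - b'x with gradient g at x.
    Setting the derivative in alpha to zero gives the closed form below.
    tau = 1 recovers the exact (Cauchy) steepest-descent step g'g / g'Ag;
    tau = 0 recovers the minimal-gradient step g'Ag / ||Ag||^2."""
    Ag = A @ g
    num = tau * (g @ g) + (1 - tau) * (g @ Ag)
    den = tau * (g @ Ag) + (1 - tau) * (Ag @ Ag)
    return num / den
```

The convex-combination weight interpolates between minimizing the objective itself (Cauchy step) and minimizing the residual gradient norm, which is the kind of trade-off the abstract's merit function describes.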
References
Figueiredo, M.A.T., Nowak, R.D., Wright, S.J.: Gradient projection for sparse reconstruction: application to compressed sensing and other inverse problems. IEEE J. Sel. Topics Signal Process. 1(4), 586–597 (2007)
Huang, H.: Efficient reconstruction of 2D images and 3D surfaces. PhD thesis, University of British Columbia (2008)
Dalmau-Cedeno, O., Oviedo, H.: A projection method for optimization problems on the Stiefel manifold. In: Mexican Conference on Pattern Recognition, pp. 84–93. Springer, Berlin (2017)
Loosli, G., Canu, S.: Quadratic programming and machine learning—large scale problems and sparsity. In: Optimization in Signal and Image Processing (2010)
Oviedo, H., Dalmau, O.: A scaled gradient projection method for minimization over the Stiefel manifold. In: Mexican International Conference on Artificial Intelligence, pp. 239–250. Springer, Berlin (2019)
di Serafino, D., Toraldo, G., Viola, M., Barlow, J.: A two-phase gradient method for quadratic programming problems with a single linear constraint and bounds on the variables. SIAM J. Optim. 28(4), 2809–2838 (2018)
Crisci, S., Ruggiero, V., Zanni, L.: Steplength selection in gradient projection methods for box-constrained quadratic programs. Appl. Math. Comput. 356, 312–327 (2019)
De Asmundis, R., Di Serafino, D., Landi, G.: On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems. J. Comput. Appl. Math. 302, 81–93 (2016)
Moré, J.J., Toraldo, G.: Algorithms for bound constrained quadratic programming problems. Numer. Math. 55(4), 377–400 (1989)
Hestenes, M.R., Stiefel, E.: Methods of Conjugate Gradients for Solving Linear Systems, vol. 49. NBS, Washington, DC (1952)
Oviedo Leon, H.F.: A delayed weighted gradient method for strictly convex quadratic minimization. Comput. Optim. Appl. 74(3), 729–746 (2019)
Oviedo, H., Dalmau, O., Herrera, R.: A hybrid gradient method for strictly convex quadratic programming. Numer. Linear Algebra Appl., e2360 (2020)
Oviedo, H., Dalmau, O., Herrera, R.: An accelerated minimal gradient method with momentum for strictly convex quadratic optimization. BIT Numer. Math. (2021)
Cauchy, A.: Méthode générale pour la résolution des systemes d’équations simultanées. C. R. Sci. Paris 25(1847), 536–538 (1847)
Luenberger, D.G., Ye, Y.: Linear and Nonlinear Programming, vol. 2. Springer, Berlin (1984)
Krasnosel'skii, M.A., Krein, S.G.: An iteration process with minimal residuals. Mat. Sbornik 73(2), 315–334 (1952)
Zou, Q., Magoulès, F.: Fast gradient methods with alignment for symmetric linear systems without using Cauchy step. arXiv preprint arXiv:1909.01479 (2019)
Di Serafino, D., Ruggiero, V., Toraldo, G., Zanni, L.: On the steplength selection in gradient methods for unconstrained optimization. Appl. Math. Comput. 318, 176–195 (2018)
Huang, Y., Dai, Y.-H., Liu, X.-W., Zhang, H.: On the asymptotic convergence and acceleration of gradient methods. arXiv preprint arXiv:1908.07111 (2019)
Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8(1), 141–148 (1988)
Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13(3), 321–326 (1993)
Dai, Y.-H., Liao, L.-Z.: R-linear convergence of the Barzilai and Borwein gradient method. IMA J. Numer. Anal. 22(1), 1–10 (2002)
Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7(1), 26–33 (1997)
Frassoldati, G., Zanni, L., Zanghirati, G.: New adaptive stepsize selections in gradient methods. J. Ind. Manag. Optim. 4(2), 299–312 (2008)
Dai, Y.-H., Yuan, Y.-X.: Alternate minimization gradient method. IMA J. Numer. Anal. 23(3), 377–393 (2003)
Dai, Y.-H.: Alternate step gradient method. Optimization 52(4–5), 395–415 (2003)
De Asmundis, R., Di Serafino, D., Hager, W.W., Toraldo, G., Zhang, H.: An efficient gradient method using the Yuan steplength. Comput. Optim. Appl. 59(3), 541–563 (2014)
Friedlander, A., Martínez, J.M., Molina, B., Raydan, M.: Gradient method with retards and generalizations. SIAM J. Numer. Anal. 36(1), 275–289 (1998)
Oviedo, H., Dalmau, O., Herrera, R.: Two novel gradient methods with optimal step sizes. J. Comput. Math. 39(3), 375 (2021)
Yuan, Y.: A new stepsize for the steepest descent method. J. Comput. Math. 149–156 (2006)
Huang, Y., Dai, Y.-H., Liu, X.-W., Zhang, H.: Gradient methods exploiting spectral properties. Optim. Methods Softw., 1–25 (2020)
Dai, Y.-H., Huang, Y., Liu, X.-W.: A family of spectral gradient methods for optimization. Comput. Optim. Appl. 74(1), 43–65 (2019)
Zou, Q., Magoulès, F.: A new cyclic gradient method adapted to large-scale linear systems. In: 2018 17th International Symposium on Distributed Computing and Applications for Business Engineering and Science (DCABES), pp. 196–199. IEEE (2018)
Zhou, B., Gao, L., Dai, Y.-H.: Gradient methods with adaptive step-sizes. Comput. Optim. Appl. 35(1), 69–86 (2006)
Liu, Z., Liu, H., Dong, X.: An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem. Optimization 67(3), 427–440 (2018)
The author was financially supported by FGV (Fundação Getulio Vargas) through the excellence post-doctoral fellowship program.
Cite this article
Oviedo, H.: A cyclic delayed weighted steplength for the gradient method. Ricerche Mat. 73, 873–885 (2024). https://doi.org/10.1007/s11587-021-00646-5