
A cyclic delayed weighted steplength for the gradient method

Published in: Ricerche di Matematica

Abstract

This paper presents a new adaptive steplength for the gradient method that exploits the advantages of the two steplengths proposed by Barzilai and Borwein. In particular, the proposed steplength is based on an optimal step size determined by minimizing a merit function constructed as a convex combination of the cost function and its gradient norm. Global convergence and some theoretical properties of the proposed gradient method are established. Finally, computational studies are included to highlight the efficiency and effectiveness of the new approach.
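
To make the merit-function idea concrete, here is a minimal sketch of a gradient method whose steplength minimizes such a merit function. It illustrates the construction described in the abstract, not the paper's actual cyclic delayed scheme: for a strictly convex quadratic f(x) = (1/2) x^T A x - b^T x it takes, at each iteration, the exact minimizer of phi(alpha) = lam * f(x - alpha*g) + (1 - lam) * ||grad f(x - alpha*g)||^2, a convex combination of the cost and the squared gradient norm along the steepest-descent ray. The weight lam, the use of the squared norm (which yields a closed form), and all identifiers (merit_gradient_method, lam, tol) are assumptions of this sketch; lam = 1 recovers the classical Cauchy step and lam = 0 the minimal-gradient step of [16].

```python
# Minimal sketch (assumptions noted above): gradient descent for a strictly
# convex quadratic f(x) = 0.5*x^T A x - b^T x, with the steplength chosen as
# the exact minimizer of the merit function
#   phi(alpha) = lam * f(x - alpha*g) + (1 - lam) * ||A(x - alpha*g) - b||^2,
# a convex combination of the cost and the squared gradient norm.
import numpy as np

def merit_gradient_method(A, b, x0, lam=0.5, tol=1e-8, max_iter=1000):
    """Hypothetical illustration; lam, tol, max_iter are sketch choices."""
    x = x0.copy()
    k = 0
    for k in range(max_iter):
        g = A @ x - b                  # gradient of the quadratic
        gg = g @ g                     # ||g||^2
        if np.sqrt(gg) < tol:
            break
        Ag = A @ g
        gAg = g @ Ag                   # g^T A g
        gAAg = Ag @ Ag                 # g^T A^2 g
        # phi(alpha) is a convex quadratic in alpha; phi'(alpha) = 0 gives
        #   alpha = (lam*g^T g + 2(1-lam)*g^T A g)
        #           / (lam*g^T A g + 2(1-lam)*g^T A^2 g).
        # lam = 1 is the Cauchy step, lam = 0 the minimal-gradient step.
        alpha = (lam * gg + 2.0 * (1.0 - lam) * gAg) / (
            lam * gAg + 2.0 * (1.0 - lam) * gAAg
        )
        x = x - alpha * g
    return x, k

# Usage on a random symmetric positive definite system.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)        # well-conditioned SPD matrix
b = rng.standard_normal(50)
x, iters = merit_gradient_method(A, b, np.zeros(50))
print(iters, np.linalg.norm(A @ x - b))
```

Choosing lam strictly between 0 and 1 interpolates between the two classical steplengths; the paper's actual rule additionally incorporates Barzilai–Borwein information together with delay and cycling, all of which this sketch omits.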


References

  1. Figueiredo, M.A.T., Nowak, R.D., Wright, S.J.: Gradient projection for sparse reconstruction: application to compressed sensing and other inverse problems. IEEE J. Sel. Topics Signal Process. 1(4), 586–597 (2007)

  2. Huang, H.: Efficient reconstruction of 2D images and 3D surfaces. PhD thesis, University of British Columbia (2008)

  3. Dalmau-Cedeno, O., Oviedo, H.: A projection method for optimization problems on the Stiefel manifold. In: Mexican Conference on Pattern Recognition, pp. 84–93. Springer, Berlin (2017)

  4. Loosli, G., Canu, S.: Quadratic programming and machine learning—large scale problems and sparsity. In: Optimization in Signal and Image Processing (2010)

  5. Oviedo, H., Dalmau, O.: A scaled gradient projection method for minimization over the Stiefel manifold. In: Mexican International Conference on Artificial Intelligence, pp. 239–250. Springer, Berlin (2019)

  6. di Serafino, D., Toraldo, G., Viola, M., Barlow, J.: A two-phase gradient method for quadratic programming problems with a single linear constraint and bounds on the variables. SIAM J. Optim. 28(4), 2809–2838 (2018)

  7. Crisci, S., Ruggiero, V., Zanni, L.: Steplength selection in gradient projection methods for box-constrained quadratic programs. Appl. Math. Comput. 356, 312–327 (2019)

  8. De Asmundis, R., Di Serafino, D., Landi, G.: On the regularizing behavior of the SDA and SDC gradient methods in the solution of linear ill-posed problems. J. Comput. Appl. Math. 302, 81–93 (2016)

  9. Moré, J.J., Toraldo, G.: Algorithms for bound constrained quadratic programming problems. Numer. Math. 55(4), 377–400 (1989)

  10. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)

  11. Oviedo Leon, H.F.: A delayed weighted gradient method for strictly convex quadratic minimization. Comput. Optim. Appl. 74(3), 729–746 (2019)

  12. Oviedo, H., Dalmau, O., Herrera, R.: A hybrid gradient method for strictly convex quadratic programming. Numer. Linear Algebra Appl. 27, e2360 (2020)

  13. Oviedo, H., Dalmau, O., Herrera, R.: An accelerated minimal gradient method with momentum for strictly convex quadratic optimization. BIT Numer. Math. (2021)

  14. Cauchy, A.: Méthode générale pour la résolution des systèmes d'équations simultanées. C. R. Acad. Sci. Paris 25, 536–538 (1847)

  15. Luenberger, D.G., Ye, Y.: Linear and Nonlinear Programming, vol. 2. Springer, Berlin (1984)

  16. Krasnosel'skii, M.A., Krein, S.G.: An iteration process with minimal residuals. Mat. Sbornik 31(73), 315–334 (1952)

  17. Zou, Q., Magoulès, F.: Fast gradient methods with alignment for symmetric linear systems without using Cauchy step. arXiv preprint arXiv:1909.01479 (2019)

  18. Di Serafino, D., Ruggiero, V., Toraldo, G., Zanni, L.: On the steplength selection in gradient methods for unconstrained optimization. Appl. Math. Comput. 318, 176–195 (2018)

  19. Huang, Y., Dai, Y.-H., Liu, X.-W., Zhang, H.: On the asymptotic convergence and acceleration of gradient methods. arXiv preprint arXiv:1908.07111 (2019)

  20. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8(1), 141–148 (1988)

  21. Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13(3), 321–326 (1993)

  22. Dai, Y.-H., Liao, L.-Z.: R-linear convergence of the Barzilai and Borwein gradient method. IMA J. Numer. Anal. 22(1), 1–10 (2002)

  23. Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7(1), 26–33 (1997)

  24. Frassoldati, G., Zanni, L., Zanghirati, G.: New adaptive stepsize selections in gradient methods. J. Ind. Manag. Optim. 4(2), 299–312 (2008)

  25. Dai, Y.-H., Yuan, Y.-X.: Alternate minimization gradient method. IMA J. Numer. Anal. 23(3), 377–393 (2003)

  26. Dai, Y.-H.: Alternate step gradient method. Optimization 52(4–5), 395–415 (2003)

  27. De Asmundis, R., Di Serafino, D., Hager, W.W., Toraldo, G., Zhang, H.: An efficient gradient method using the Yuan steplength. Comput. Optim. Appl. 59(3), 541–563 (2014)

  28. Friedlander, A., Martínez, J.M., Molina, B., Raydan, M.: Gradient method with retards and generalizations. SIAM J. Numer. Anal. 36(1), 275–289 (1998)

  29. Oviedo, H., Dalmau, O., Herrera, R.: Two novel gradient methods with optimal step sizes. J. Comput. Math. 39(3), 375 (2021)

  30. Yuan, Y.: A new stepsize for the steepest descent method. J. Comput. Math. 24(2), 149–156 (2006)

  31. Huang, Y., Dai, Y.-H., Liu, X.-W., Zhang, H.: Gradient methods exploiting spectral properties. Optim. Methods Softw., 1–25 (2020)

  32. Dai, Y.-H., Huang, Y., Liu, X.-W.: A family of spectral gradient methods for optimization. Comput. Optim. Appl. 74(1), 43–65 (2019)

  33. Zou, Q., Magoulès, F.: A new cyclic gradient method adapted to large-scale linear systems. In: 2018 17th International Symposium on Distributed Computing and Applications for Business Engineering and Science (DCABES), pp. 196–199. IEEE (2018)

  34. Zhou, B., Gao, L., Dai, Y.-H.: Gradient methods with adaptive step-sizes. Comput. Optim. Appl. 35(1), 69–86 (2006)

  35. Liu, Z., Liu, H., Dong, X.: An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem. Optimization 67(3), 427–440 (2018)

Author information

Corresponding author

Correspondence to Harry Oviedo.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The author was financially supported by FGV (Fundação Getulio Vargas) through the excellence post-doctoral fellowship program.


About this article

Cite this article

Oviedo, H. A cyclic delayed weighted steplength for the gradient method. Ricerche Mat. 73, 873–885 (2024). https://doi.org/10.1007/s11587-021-00646-5

