A second-order gradient method for convex minimization

  • Original Article
  • Published in Boletín de la Sociedad Matemática Mexicana


Abstract

This work addresses strictly convex unconstrained minimization via a modified gradient method. The proposal is a line search method whose search direction is built from the gradient method: it mixes the negative gradient with another particular direction that incorporates second-order information. Unlike Newton-type methods, the algorithm does not need to compute the inverse of the Hessian of the objective function. We analyze global convergence under an exact line search. A numerical study illustrates the effectiveness of the method by comparing it with several conjugate gradient methods and with the Barzilai–Borwein gradient method on both quadratic and nonlinear problems.
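
For concreteness, below is a minimal sketch of the family of methods the abstract describes, restricted to the strictly convex quadratic case f(x) = (1/2) x^T A x - b^T x, where the exact line search has a closed form. The paper's actual construction of the mixed direction is not given in the abstract, so the convex-combination weight `beta` and the direction -(1 - beta) g - beta (A g) are illustrative assumptions, not the published algorithm; the sketch shares only its general shape: a line search along a gradient-based direction that uses second-order information through Hessian-vector products, never the Hessian inverse.

```python
import numpy as np

def mixed_gradient_method(A, b, x0, beta=0.5, tol=1e-8, max_iter=10000):
    """Gradient-type line search for f(x) = 0.5 * x^T A x - b^T x, A SPD.

    The search direction mixes the negative gradient with a term carrying
    second-order information. NOTE: `beta` and the particular mixture
    -(1 - beta) * g - beta * (A @ g) are hypothetical stand-ins for the
    paper's construction, which the abstract does not specify.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = A @ x - b                      # gradient of the quadratic
        if np.linalg.norm(g) <= tol:
            break
        # Second-order information enters only through Hessian-vector
        # products; the Hessian inverse is never formed (unlike Newton).
        d = -(1.0 - beta) * g - beta * (A @ g)
        # Exact line search: alpha minimizes f(x + alpha * d) in closed form.
        alpha = -(g @ d) / (d @ (A @ d))
        x = x + alpha * d
    return x, k

# Example on a random symmetric positive definite system.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)
b = rng.standard_normal(50)
x, iters = mixed_gradient_method(A, b, np.zeros(50))
print(iters, np.linalg.norm(A @ x - b))    # residual should be near tol
```

For any beta in [0, 1], g^T d = -(1 - beta) ||g||^2 - beta g^T A g < 0 whenever g is nonzero and A is positive definite, so d is always a descent direction and the exact step alpha is positive; beta = 0 recovers classical steepest descent with the Cauchy step.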


Acknowledgements

The author would like to thank MSc. Shaday Guerrero for his careful reading of the manuscript, which improved the paper.

Author information

Correspondence to Harry Oviedo.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The author was financially supported by FGV (Fundação Getulio Vargas) through the excellence postdoctoral fellowship program.

About this article

Cite this article

Oviedo, H. A second-order gradient method for convex minimization. Bol. Soc. Mat. Mex. 27, 66 (2021). https://doi.org/10.1007/s40590-021-00375-7
