Abstract
This work addresses the strictly convex unconstrained minimization problem via a modified version of the gradient method. The proposal is a line search method whose search direction blends the negative gradient with a second direction that incorporates second-order information. Unlike Newton-type methods, the algorithm does not require computing the inverse of the Hessian of the objective function. Global convergence is analyzed under an exact line search. A numerical study illustrates the effectiveness of the method by comparing it with several conjugate gradient methods and with the Barzilai–Borwein gradient method on both quadratic and nonlinear problems.
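The abstract does not spell out the proposed direction, so the sketch below implements only the classical Barzilai–Borwein gradient method used as a baseline in the comparison, applied to a strictly convex quadratic. The function name `barzilai_borwein`, the test matrix, and the tolerance are illustrative choices, not details from the paper; the first step uses an exact (Cauchy) line search, as the paper's convergence analysis also assumes exact line searches.

```python
import numpy as np

def barzilai_borwein(A, b, x0, tol=1e-8, max_iter=1000):
    """Barzilai-Borwein (BB1) gradient method for the strictly convex
    quadratic f(x) = 0.5 x^T A x - b^T x, with A symmetric positive
    definite. The gradient is g(x) = A x - b."""
    x = x0.astype(float)
    g = A @ x - b
    # First iteration: exact (Cauchy) line search along -g.
    alpha = (g @ g) / (g @ (A @ g))
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g    # iterate and gradient differences
        alpha = (s @ s) / (s @ y)      # BB1 step size; s^T y > 0 since A is SPD
        x, g = x_new, g_new
    return x

# Usage: the minimizer of the quadratic solves the linear system A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = barzilai_borwein(A, b, np.zeros(2))
```

On quadratics the BB step size is the inverse of a Rayleigh quotient of `A`, which is why the method needs only matrix-vector products and no Hessian inverse, the same practical advantage the abstract claims for the proposed method.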
Acknowledgements
The author would like to thank MSc. Shaday Guerrero for his careful reading of the manuscript, which improved the paper.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
The author was financially supported by FGV (Fundação Getulio Vargas) through the excellence post-doctoral fellowship program.
Cite this article
Oviedo, H. A second-order gradient method for convex minimization. Bol. Soc. Mat. Mex. 27, 66 (2021). https://doi.org/10.1007/s40590-021-00375-7