Abstract
In this paper, we present a new memory gradient method whose search direction is a sufficient descent direction for the objective function at every iteration. We then establish its global convergence under mild conditions and derive its convergence rate for uniformly convex functions. Finally, we report numerical results demonstrating the efficiency of the proposed method.
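To make the idea concrete, here is a minimal sketch of a generic memory gradient iteration, not the authors' specific variant: the direction d_k = -g_k + η_k d_{k-1} reuses the previous direction ("memory"), and the weight η_k is damped so that g_kᵀd_k ≤ -c‖g_k‖² holds, i.e. the sufficient descent condition the abstract refers to. The parameter names (`eta_bar`, `c`) and the Armijo line search are illustrative assumptions.

```python
import numpy as np

def memory_gradient(f, grad, x0, eta_bar=0.9, c=0.5, tol=1e-8, max_iter=5000):
    """Illustrative memory gradient method (a generic sketch, not the
    authors' exact variant).

    Direction: d_k = -g_k + eta_k * d_{k-1}, with eta_k chosen so that
    g_k^T d_k <= -c * ||g_k||^2  (a sufficient descent condition).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first step: plain steepest descent
    for _ in range(max_iter):
        gn2 = g @ g
        if np.sqrt(gn2) < tol:
            break
        # Damp the memory weight: with this eta one can check that
        # g^T d_new = -gn2 + eta * (g^T d_old) <= -c * gn2.
        denom = abs(g @ d) + gn2
        eta = min(eta_bar, (1.0 - c) * gn2 / denom) if denom > 0 else 0.0
        d = -g + eta * d
        # Backtracking (Armijo) line search; terminates since g^T d < 0.
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = x + t * d
        g = grad(x)
    return x

# Usage: minimize a uniformly convex quadratic f(x) = 0.5*x1^2 + 5*x2^2.
x_star = memory_gradient(
    lambda x: 0.5 * x[0] ** 2 + 5.0 * x[1] ** 2,
    lambda x: np.array([x[0], 10.0 * x[1]]),
    [3.0, -2.0],
)
```

The damping of η_k is what distinguishes this from a plain conjugate-gradient-style update: it guarantees descent regardless of how the previous direction relates to the current gradient.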
Zheng, Y., Wan, Z. A new variant of the memory gradient method for unconstrained optimization. Optim Lett 6, 1643–1655 (2012). https://doi.org/10.1007/s11590-011-0355-6