Abstract
An accelerated three-term conjugate gradient method is proposed, in which the search direction satisfies both the sufficient descent condition and an extended Dai–Liao conjugacy condition. Unlike existing methods, the proposed method employs a dynamic compensation strategy: the Li–Fukushima-type quasi-Newton equation is satisfied as far as possible, and when it cannot be, the singular values of the iteration matrix of the search directions are adaptively clustered. This clustering accelerates convergence and reduces the condition number of the iteration matrix. Global convergence is established under mild conditions for general objective functions. Numerical results are reported to demonstrate the method's efficiency.
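To make the setting concrete, the sketch below implements a generic three-term conjugate gradient iteration with a Dai–Liao-type parameter and a Wolfe line search. It illustrates the underlying framework only, not the accelerated method with dynamic compensation proposed in the paper; the function name three_term_cg, the parameter t, and the restart and fallback rules are our own illustrative assumptions.

```python
# Minimal, illustrative three-term CG sketch (not the paper's exact
# accelerated algorithm): the direction
#     d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k
# uses a Dai-Liao-type beta_k and a theta_k chosen so that the
# sufficient descent condition holds whenever d_k^T y_k > 0.
import numpy as np
from scipy.optimize import line_search


def three_term_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) <= tol:
            break
        # Wolfe line search; SciPy returns alpha=None on failure.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4  # crude fallback to keep the sketch short
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        dy = d @ y
        if abs(dy) < 1e-12:
            d_new = -g_new  # restart on a degenerate curvature pair
        else:
            beta = (g_new @ y - t * (g_new @ s)) / dy  # Dai-Liao beta
            theta = (g_new @ d) / dy
            # The theta*y term cancels beta's g_new^T d contribution,
            # giving g_new^T d_new <= -||g_new||^2 whenever d^T y > 0.
            d_new = -g_new + beta * d - theta * y
        x, g, d = x_new, g_new, d_new
    return x


# Example: minimize the Rosenbrock function from SciPy's test set.
if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    print(three_term_cg(rosen, rosen_der, np.array([-1.2, 1.0])))
```

With t = 0 the direction reduces to a three-term Hestenes–Stiefel form and satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 exactly; for t > 0, the Wolfe conditions guarantee d_k^T y_k > 0, so the extra term only strengthens the sufficient descent inequality.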
Notes
When the memory is zero in CG_DESCENT 6.0, the code reduces to CG_DESCENT 5.3 (except for minor changes in default parameters); see page 2165 of Hager and Zhang [55].
References
Fletcher, R., Reeves, C.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)
Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)
Dai, Y., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Andrei, N.: Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)
Al-Baali, M.: Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Dai, Y., Kou, C.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Narushima, Y., Yabe, H., Ford, J.A.: A three-term conjugate gradient method with sufficient descent property for unconstrained optimization. SIAM J. Optim. 21(1), 212–230 (2011)
Zhang, L., Zhou, W., Li, D.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22(4), 697–711 (2007)
Al-Baali, M., Narushima, Y., Yabe, H.: A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization. Comput. Optim. Appl. 60(1), 89–110 (2015)
Yu, G., Guan, L., Chen, W.: Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization. Optim. Methods Softw. 23(2), 275–293 (2008)
Zhang, L., Zhou, W., Li, D.: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26(4), 629–640 (2006)
Zhang, L., Zhou, W., Li, D.: Global convergence of a modified Fletcher–Reeves conjugate gradient method with Armijo-type line search. Numer. Math. 104(4), 561–572 (2006)
Cheng, W., Li, D.: An active set modified Polak–Ribière–Polyak method for large-scale nonlinear bound constrained optimization. J. Optim. Theory Appl. 155(3), 1084–1094 (2012)
Dong, X., Liu, H., Xu, Y., Yang, X.: Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence. Optim. Lett. 9(7), 1421–1432 (2015)
Yuan, G., Li, Y., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168(1), 129–152 (2016)
Yuan, G., Wei, Z., Lu, X.: Global convergence of BFGS and PRP methods under a modified weak Wolfe–Powell line search. Appl. Math. Model. 47, 811–825 (2017)
Yuan, G., Sheng, Z., Wang, B., Hu, W., Li, C.: The global convergence of a modified BFGS method for nonconvex functions. J. Comput. Appl. Math. 327, 274–294 (2018)
Deng, S., Wan, Z.: A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems. Appl. Numer. Math. 92, 70–81 (2015)
Dai, Z., Chen, X., Wen, F.: A modified Perry’s conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations. Appl. Math. Comput. 270(7), 378–386 (2015)
Liu, J., Li, S.: Spectral gradient method for impulse noise removal. Optim. Lett. 9(7), 1341–1351 (2015)
Dai, Z., Wen, F.: Some improved sparse and stable portfolio optimization problems. Financ. Res. Lett. (2018). https://doi.org/10.1016/j.frl.2018.02.026
Yu, J., Li, M., Wang, Y., He, G.: A decomposition method for large-scale box constrained optimization. Appl. Math. Comput. 231(12), 9–15 (2014)
Sun, L., He, G., Wang, Y., Zhou, C.: An accurate active set Newton algorithm for large-scale bound constrained optimization. Appl. Math. 56(3), 297–314 (2011)
Sun, L., Fang, L., He, G.: An active set strategy based on the multiplier function or the gradient. Appl. Math. 55(4), 291–304 (2010)
Zheng, F., Han, C., Wang, Y.: Parallel SSLE algorithm for large scale constrained optimization. Appl. Math. Comput. 217(12), 5377–5384 (2011)
Sun, Z., Li, H., Wang, J., Tian, Y.: Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization. Int. J. Comput. Math. (2018). https://doi.org/10.1080/00207160.2017.1366457
Perry, A.: A modified conjugate gradient algorithm. Oper. Res. 26(6), 1073–1078 (1978)
Dai, Y., Liao, L.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)
Li, D., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129, 15–35 (2001)
Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)
Yabe, H., Takano, M.: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28, 203–225 (2004)
Ford, J.A., Narushima, Y., Yabe, H.: Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40(2), 191–216 (2008)
Li, G., Tang, C., Wei, Z.: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202(2), 523–539 (2007)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
Andrei, N.: On three-term conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 219(11), 6316–6327 (2013)
Andrei, N.: A simple three-term conjugate gradient algorithm for unconstrained optimization. J. Comput. Appl. Math. 241, 19–29 (2013)
Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)
Andrei, N.: Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update. J. Comput. Appl. Math. 325, 149–164 (2017)
Al-Baali, M., Caliciotti, A., Fasano, G., Roma, M.: Exploiting damped techniques for nonlinear conjugate gradient methods. Math. Methods Oper. Res. (2018). https://doi.org/10.1007/s00186-017-0593-1
Babaie-Kafaki, S., Ghanbari, R.: A descent family of Dai–Liao conjugate gradient methods. Optim. Methods Softw. 29(3), 583–591 (2014)
Babaie-Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)
Andrei, N.: A Dai–Liao conjugate gradient algorithm with clustering of eigenvalues. Numer. Algorithms 77(4), 1273–1282 (2018)
Babaie-Kafaki, S.: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae. J. Optim. Theory Appl. 167(1), 91–101 (2015)
Fatemi, M.: A new efficient conjugate gradient method for unconstrained optimization. J. Comput. Appl. Math. 300, 207–216 (2016)
Dong, X., Liu, H., He, Y.: A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. 165(1), 225–241 (2015)
Dong, X., Liu, H., He, Y., Yang, X.: A modified Hestenes–Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition. J. Comput. Appl. Math. 281, 239–249 (2015)
Dong, X., Han, D., Ghanbari, R., Li, X., Dai, Z.: Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination. Optimization 66(5), 759–776 (2017)
Dong, X., Liu, H., He, Y.: New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction. Appl. Math. Comput. 269, 606–617 (2015)
Babaie-Kafaki, S., Ghanbari, R.: Two modified three-term conjugate gradient methods with sufficient descent property. Optim. Lett. 8(8), 2285–2297 (2014)
Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. SIAM J. Optim. 23(4), 2150–2168 (2013)
Babaie-Kafaki, S., Fatemi, M.: A modified two-point stepsize gradient algorithm for unconstrained minimization. Optim. Methods Softw. 28(5), 1040–105 (2013)
Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)
Bongartz, I., Conn, A.R., Gould, N., Toint, P.L.: CUTE: constrained and unconstrained testing environment. ACM Trans. Math. Softw. 21(1), 123–160 (1995)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)
Acknowledgements
We would like to thank Professor W.W. Hager and Professor H. Zhang for the CG_DESCENT code. We are grateful to the anonymous referees and the editor for their useful comments, which have made the paper clearer and more comprehensive than the earlier version. This work was supported by the National Natural Science Foundation of China (11601012, 11431002, 11625105, 71771030, 11861002), the Natural Science Foundation of Ningxia (NZ17103), the Natural Science Foundation of Guangxi (2018GXNSFAA138169), the Guangxi Key Laboratory of Cryptography and Information Security (GCIS201708), the Project of Shandong Province Higher Education Science and Technology (J17KA166), and the Key Project of North Minzu University (ZDZX201804).
Additional information
Communicated by Alexandre Cabot.
About this article
Cite this article
Dong, X., Han, D., Dai, Z. et al. An Accelerated Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition. J Optim Theory Appl 179, 944–961 (2018). https://doi.org/10.1007/s10957-018-1377-3
Keywords
- Three-term conjugate gradient method
- Sufficient descent condition
- Conjugacy condition
- Global convergence
- Condition number