
An Accelerated Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition

Published in: Journal of Optimization Theory and Applications

Abstract

An accelerated three-term conjugate gradient method is proposed, in which the search direction satisfies the sufficient descent condition as well as an extended Dai–Liao conjugacy condition. Unlike existing methods, the proposed method employs a dynamical compensation strategy: the Li–Fukushima-type quasi-Newton equation is satisfied as far as possible; when it cannot be, the singular values of the iteration matrix of search directions are, to some extent, adaptively clustered, which substantially helps to accelerate convergence or to reduce the condition number of the iteration matrix. Global convergence is established under mild conditions for general objective functions. We also report numerical results that demonstrate the method's efficiency.
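
For orientation, the two conditions named in the abstract are standard in the nonlinear conjugate gradient literature. Writing \(g_k=\nabla f(x_k)\), \(s_k=x_{k+1}-x_k\) and \(y_k=g_{k+1}-g_k\), the sufficient descent condition requires \(g_k^\top d_k \le -c\|g_k\|^2\) for some \(c>0\), and the Dai–Liao conjugacy condition [32] requires \(d_{k+1}^\top y_k = -t\,g_{k+1}^\top s_k\) with \(t\ge 0\); the paper works with an extended form of the latter.

The sketch below is not the authors' accelerated method. It is a minimal three-term conjugate gradient loop in the style of Zhang–Zhou–Li [12], whose direction satisfies \(g_{k+1}^\top d_{k+1}=-\|g_{k+1}\|^2\) by construction, included only to make the three-term structure and the descent requirement concrete. The Armijo backtracking line search and its parameters are illustrative choices, not those used in the paper.

    import numpy as np

    def three_term_cg(f, grad, x0, tol=1e-6, max_iter=10000):
        """Minimal three-term CG sketch (Zhang-Zhou-Li-style direction).

        Uses d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k with
        beta_k = g_{k+1}^T y_k / ||g_k||^2 (PRP) and
        theta_k = g_{k+1}^T d_k / ||g_k||^2, so that
        g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds exactly.
        """
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            # Armijo backtracking line search (illustrative parameters).
            # Terminates because d is a descent direction: g^T d = -||g||^2 < 0.
            alpha, rho, c1 = 1.0, 0.5, 1e-4
            fx, gTd = f(x), g @ d
            while f(x + alpha * d) > fx + c1 * alpha * gTd and alpha > 1e-12:
                alpha *= rho
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g
            gg = g @ g                     # ||g_k||^2 > 0 since ||g_k|| > tol
            beta = (g_new @ y) / gg        # PRP coefficient
            theta = (g_new @ d) / gg       # third-term coefficient
            d = -g_new + beta * d - theta * y
            x, g = x_new, g_new
        return x

    # Usage example on the Rosenbrock function.
    if __name__ == "__main__":
        f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
        grad = lambda x: np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
            200 * (x[1] - x[0]**2),
        ])
        print(three_term_cg(f, grad, np.array([-1.2, 1.0])))

The exact descent identity is what makes the three-term form attractive: it holds for any line search, whereas two-term methods typically need line-search conditions to guarantee descent (cf. [12, 16]).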


Notes

  1. http://www.math.ufl.edu/~hager/papers/CG

  2. When the memory parameter is zero in CG_DESCENT 6.0, the code reduces to CG_DESCENT 5.3 (except for minor changes in default parameters); see page 2165 of Ref. [55].

References

  1. Fletcher, R., Reeves, C.: Function minimization by conjugate gradients. Comput. J. 7(2), 149–154 (1964)

  2. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)

  3. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)

  4. Dai, Y., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

  5. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)

  6. Andrei, N.: Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)

  7. Al-Baali, M.: Descent property and global convergence of the Fletcher–Reeves method with inexact line search. IMA J. Numer. Anal. 5(1), 121–124 (1985)

  8. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)

  9. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  10. Dai, Y., Kou, C.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  11. Narushima, Y., Yabe, H., Ford, J.A.: A three-term conjugate gradient method with sufficient descent property for unconstrained optimization. SIAM J. Optim. 21(1), 212–230 (2011)

  12. Zhang, L., Zhou, W., Li, D.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22(4), 697–711 (2007)

  13. Al-Baali, M., Narushima, Y., Yabe, H.: A family of three-term conjugate gradient methods with sufficient descent property for unconstrained optimization. Comput. Optim. Appl. 60(1), 89–110 (2015)

  14. Yu, G., Guan, L., Chen, W.: Spectral conjugate gradient methods with sufficient descent property for large-scale unconstrained optimization. Optim. Methods Softw. 23(2), 275–293 (2008)

  15. Zhang, L., Zhou, W., Li, D.: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26(4), 629–640 (2006)

  16. Zhang, L., Zhou, W., Li, D.: Global convergence of a modified Fletcher–Reeves conjugate gradient method with Armijo-type line search. Numer. Math. 104(4), 561–572 (2006)

  17. Cheng, W., Li, D.: An active set modified Polak–Ribière–Polyak method for large-scale nonlinear bound constrained optimization. J. Optim. Theory Appl. 155(3), 1084–1094 (2012)

  18. Dong, X., Liu, H., Xu, Y., Yang, X.: Some nonlinear conjugate gradient methods with sufficient descent condition and global convergence. Optim. Lett. 9(7), 1421–1432 (2015)

  19. Yuan, G., Li, Y., Li, Y.: A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations. J. Optim. Theory Appl. 168(1), 129–152 (2016)

  20. Yuan, G., Wei, Z., Lu, X.: Global convergence of BFGS and PRP methods under a modified weak Wolfe–Powell line search. Appl. Math. Model. 47, 811–825 (2017)

  21. Yuan, G., Sheng, Z., Wang, B., Hu, W., Li, C.: The global convergence of a modified BFGS method for nonconvex functions. J. Comput. Appl. Math. 327, 274–294 (2018)

  22. Deng, S., Wan, Z.: A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems. Appl. Numer. Math. 92, 70–81 (2015)

  23. Dai, Z., Chen, X., Wen, F.: A modified Perry's conjugate gradient method-based derivative-free method for solving large-scale nonlinear monotone equations. Appl. Math. Comput. 270, 378–386 (2015)

  24. Liu, J., Li, S.: Spectral gradient method for impulse noise removal. Optim. Lett. 9(7), 1341–1351 (2015)

  25. Dai, Z., Wen, F.: Some improved sparse and stable portfolio optimization problems. Financ. Res. Lett. (2018). https://doi.org/10.1016/j.frl.2018.02.026

  26. Yu, J., Li, M., Wang, Y., He, G.: A decomposition method for large-scale box constrained optimization. Appl. Math. Comput. 231, 9–15 (2014)

  27. Sun, L., He, G., Wang, Y., Zhou, C.: An accurate active set Newton algorithm for large-scale bound constrained optimization. Appl. Math. 56(3), 297–314 (2011)

  28. Sun, L., Fang, L., He, G.: An active set strategy based on the multiplier function or the gradient. Appl. Math. 55(4), 291–304 (2010)

  29. Zheng, F., Han, C., Wang, Y.: Parallel SSLE algorithm for large-scale constrained optimization. Appl. Math. Comput. 217(12), 5377–5384 (2011)

  30. Sun, Z., Li, H., Wang, J., Tian, Y.: Two modified spectral conjugate gradient methods and their global convergence for unconstrained optimization. Int. J. Comput. Math. (2018). https://doi.org/10.1080/00207160.2017.1366457

  31. Perry, A.: A modified conjugate gradient algorithm. Oper. Res. 26, 1073–1078 (1978)

  32. Dai, Y., Liao, L.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)

  33. Li, D., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129, 15–35 (2001)

  34. Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)

  35. Yabe, H., Takano, M.: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28, 203–225 (2004)

  36. Ford, J.A., Narushima, Y., Yabe, H.: Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40(2), 191–216 (2008)

  37. Li, G., Tang, C., Wei, Z.: New conjugacy condition and related new conjugate gradient methods for unconstrained optimization. J. Comput. Appl. Math. 202(2), 523–539 (2007)

  38. Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)

  39. Andrei, N.: On three-term conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 219(11), 6316–6327 (2013)

  40. Andrei, N.: A simple three-term conjugate gradient algorithm for unconstrained optimization. J. Comput. Appl. Math. 241, 19–29 (2013)

  41. Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)

  42. Andrei, N.: Accelerated adaptive Perry conjugate gradient algorithms based on the self-scaling memoryless BFGS update. J. Comput. Appl. Math. 325, 149–164 (2017)

  43. Al-Baali, M., Caliciotti, A., Fasano, G., Roma, M.: Exploiting damped techniques for nonlinear conjugate gradient methods. Math. Methods Oper. Res. (2018). https://doi.org/10.1007/s00186-017-0593-1

  44. Babaie-Kafaki, S., Ghanbari, R.: A descent family of Dai–Liao conjugate gradient methods. Optim. Methods Softw. 29(3), 583–591 (2014)

  45. Babaie-Kafaki, S., Ghanbari, R.: The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur. J. Oper. Res. 234(3), 625–630 (2014)

  46. Andrei, N.: A Dai–Liao conjugate gradient algorithm with clustering of eigenvalues. Numer. Algorithms 77(4), 1273–1282 (2018)

  47. Babaie-Kafaki, S.: On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae. J. Optim. Theory Appl. 167(1), 91–101 (2015)

  48. Fatemi, M.: A new efficient conjugate gradient method for unconstrained optimization. J. Comput. Appl. Math. 300, 207–216 (2016)

  49. Dong, X., Liu, H., He, Y.: A self-adjusting conjugate gradient method with sufficient descent condition and conjugacy condition. J. Optim. Theory Appl. 165(1), 225–241 (2015)

  50. Dong, X., Liu, H., He, Y., Yang, X.: A modified Hestenes–Stiefel conjugate gradient method with sufficient descent condition and conjugacy condition. J. Comput. Appl. Math. 281, 239–249 (2015)

  51. Dong, X., Han, D., Ghanbari, R., Li, X., Dai, Z.: Some new three-term Hestenes–Stiefel conjugate gradient methods with affine combination. Optimization 66(5), 759–776 (2017)

  52. Dong, X., Liu, H., He, Y.: New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction. Appl. Math. Comput. 269, 606–617 (2015)

  53. Deng, S., Wan, Z.: A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems. Appl. Numer. Math. 92, 70–81 (2015)

  54. Babaie-Kafaki, S., Ghanbari, R.: Two modified three-term conjugate gradient methods with sufficient descent property. Optim. Lett. 8(8), 2285–2297 (2014)

  55. Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. SIAM J. Optim. 23(4), 2150–2168 (2013)

  56. Babaie-Kafaki, S., Ghanbari, R.: Two modified three-term conjugate gradient methods with sufficient descent property. Optim. Lett. 8(8), 2285–2297 (2014)

  57. Babaie-Kafaki, S., Fatemi, M.: A modified two-point stepsize gradient algorithm for unconstrained minimization. Optim. Methods Softw. 28(5), 1040–1050 (2013)

  58. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)

  59. Bongartz, I., Conn, A.R., Gould, N., Toint, P.L.: CUTE: constrained and unconstrained testing environment. ACM Trans. Math. Softw. 21(1), 123–160 (1995)

  60. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)

Acknowledgements

We would like to thank Professor W.W. Hager and Professor H. Zhang for the CG_DESCENT code. We are grateful to the anonymous referees and editor for their useful comments, which have made the paper clearer and more comprehensive than the earlier version. This work is supported by the National Natural Science Foundation of China (11601012, 11431002, 11625105, 71771030, 11861002), Natural Science Foundation of Ningxia (NZ17103), Natural Science Foundation of Guangxi (2018GXNSFAA138169), Guangxi Key Laboratory of Cryptography and Information Security (GCIS201708), Project of Shandong Province Higher Education Science and Technology (J17KA166), the key project of North Minzu University (ZDZX201804).

Author information


Corresponding author

Correspondence to XiaoLiang Dong.

Additional information

Communicated by Alexandre Cabot.


About this article


Cite this article

Dong, X., Han, D., Dai, Z. et al. An Accelerated Three-Term Conjugate Gradient Method with Sufficient Descent Condition and Conjugacy Condition. J Optim Theory Appl 179, 944–961 (2018). https://doi.org/10.1007/s10957-018-1377-3

