Hybridization of accelerated gradient descent method

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

We present a gradient descent algorithm with a line search procedure for solving unconstrained optimization problems. The algorithm is defined by applying the Picard-Mann hybrid iterative process to the accelerated gradient descent SM method described in Stanimirović and Miladinović (Numer. Algor. 54, 503–520, 2010). Using the merged features of both analyzed models, we show that the new accelerated gradient descent model converges linearly and faster than the original SM method, which is confirmed through the displayed numerical test results. Three main properties are measured: the number of iterations, the CPU time, and the number of function evaluations. The efficiency of the proposed iteration is examined for several values of the correction parameter introduced in Khan (2013).
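To make the construction concrete, the sketch below combines the Picard-Mann hybrid process from Khan (2013) with a plain gradient-step mapping and Armijo backtracking. This is an illustrative assumption, not the authors' exact HSM scheme (which uses the accelerated SM step from Stanimirović and Miladinović); the function names, the mapping T, and the choice of correction parameter `alpha` are all hypothetical.

```python
import numpy as np

def backtracking(f, grad, x, d, t0=1.0, beta=0.5, sigma=1e-4):
    """Armijo backtracking line search along a descent direction d."""
    t = t0
    fx, slope = f(x), grad(x).dot(d)
    while f(x + t * d) > fx + sigma * t * slope:
        t *= beta
    return t

def picard_mann_gd(f, grad, x0, alpha=0.5, tol=1e-8, max_iter=1000):
    """Picard-Mann hybrid iteration applied to a gradient-step mapping.

    With T(x) = x - t*grad(x), where t is chosen by backtracking,
    one Picard-Mann step (Khan, 2013) reads
        y_k     = (1 - alpha) * x_k + alpha * T(x_k)   # Mann averaging
        x_{k+1} = T(y_k)                               # Picard step
    and alpha in (0, 1) plays the role of the correction parameter.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        t = backtracking(f, grad, x, -g)
        y = (1.0 - alpha) * x + alpha * (x - t * g)    # y_k
        gy = grad(y)
        if np.linalg.norm(gy) < tol:
            x = y
            break
        ty = backtracking(f, grad, y, -gy)
        x = y - ty * gy                                # x_{k+1} = T(y_k)
    return x
```

For a simple convex quadratic such as f(x) = ‖x − 1‖², the iterates converge to the minimizer in a few steps; the averaging step means each outer iteration advances further than a single gradient step, which is the intuition behind the speed-up reported in the abstract.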

References

  1. Andrei, N.: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization. Numer. Algor. 42, 63–73 (2006)

  2. Andrei, N.: An unconstrained optimization test functions collection. http://www.ici.ro/camo/journal/vol10/v10a10.pdf

  3. Armijo, L.: Minimization of functions having Lipschitz continuous first partial derivatives. Pac. J. Math. 16, 1–3 (1966)

  4. Bauschke, H., Combettes, P. L.: Convex analysis and monotone operator theory in Hilbert spaces, p 144. Springer, Berlin (2011)

  5. Clarkson, J.A.: Uniformly convex spaces. Trans. Am. Math. Soc. 40, 396–414 (1936)

  6. Dai, Y.H., Liao, L.Z.: R-linear convergence of the Barzilai and Borwein gradient method. IMA J. Numer. Anal. 22, 1–10 (2002)

  7. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  8. Goldstein, A.A.: On steepest descent. SIAM J. Control 3, 147–151 (1965)

  9. Ishikawa, S.: Fixed points by a new iteration method. Proc. Am. Math. Soc. 44, 147–150 (1974)

  10. Khan, S.H.: A Picard-Mann hybrid iterative process. Fixed Point Theory Appl. 2013, 69 (2013)

  11. Lemaréchal, C.: A view of line search. In: Auslender, A., Oettli, W., Stoer, J. (eds.) Optimization and Optimal Control, pp 59–78. Springer, Berlin (1981)

  12. Luenberger, D.G., Ye, Y.: Linear and nonlinear programming. Springer Science+Business Media LLC, New York (2008)

  13. Mann, W.R.: Mean value methods in iterations. Proc. Am. Math. Soc. 4, 506–510 (1953)

  14. Molina, B., Raydan, M.: Preconditioned Barzilai–Borwein method for the numerical solution of partial differential equations. Numer. Algor. 13, 45–60 (1996)

  15. Moré, J.J., Thuente, D.J.: Line search algorithms with guaranteed sufficient decrease, Mathematics and Computer Science Division Preprint MCS-P153-0590. Argonne National Laboratory, Argonne (1990)

  16. Nesterov, Ye.E.: A method for solving the convex programming problem with convergence rate O\((\frac {1}{k^2})\). Dokl. Akad. Nauk SSSR 269(3), 543–547 (1983)

  17. Ortega, J.M., Rheinboldt, W.C.: Iterative solution of nonlinear equations in several variables. Academic Press, New York (1970)

  18. Petrović, M.J.: An accelerated double step size method in unconstrained optimization. Appl. Math. Comput. 250, 309–319 (2015)

  19. Petrović, M.J., Stanimirović, P.S.: Accelerated double direction method for solving unconstrained optimization problems. Math. Probl. Eng. 2014, Article ID 965104, 8 pp (2014)

  20. Stanimirović, P.S., Milovanović, G.V., Petrović, M.J.: A transformation of accelerated double step size method for unconstrained optimization. Math. Probl. Eng. 2015, Article ID 283679, 8 pp (2015)

  21. Potra, F.A., Shi, Y.: Efficient line search algorithm for unconstrained optimization. J. Optim. Theory Appl. 85, 677–704 (1995)

  22. Powell, M.J.D.: Some global convergence properties of a variable-metric algorithm for minimization without exact line searches. SIAM-AMS Proc. 9, 53–72 (1976)

  23. Picard, E.: Mémoire sur la théorie des équations aux dérivées partielles et la méthode des approximations successives. J. Math. Pures Appl. 6, 145–210 (1890)

  24. Rockafellar, R.T.: Convex analysis. Princeton University Press, Princeton (1970)

  25. Shi, Z.-J.: Convergence of line search methods for unconstrained optimization. Appl. Math. Comput. 157, 393–405 (2004)

  26. Stanimirović, P.S., Miladinović, M.B.: Accelerated gradient descent methods with line search. Numer. Algor. 54, 503–520 (2010)

  27. Sun, W., Yuan, Y.-X.: Optimization theory and methods: nonlinear programming. Springer, Berlin (2006)

  28. Wolfe, P.: Convergence conditions for ascent methods. SIAM Rev. 11, 226–235 (1969)

  29. Zalinescu, C.: Convex analysis in general vector spaces. World Scientific, Singapore (2002)


Author information

Correspondence to Milena Petrović.

Cite this article

Petrović, M., Rakočević, V., Kontrec, N. et al. Hybridization of accelerated gradient descent method. Numer Algor 79, 769–786 (2018). https://doi.org/10.1007/s11075-017-0460-4
