On the final steps of Newton and higher order methods

  • Original Paper
  • Optimization Letters

Abstract

Newton's method is one of the most widely used methods for solving nonlinear systems of equations when the Jacobian matrix is nonsingular. The method converges to a solution with Q-order two for initial points sufficiently close to the solution. The methods of Halley and Chebyshev are among the methods with a local and cubic rate of convergence. Combined with a backtracking and curvilinear strategy for unconstrained optimization problems, these methods have been shown to be globally convergent. The backtracking forces a strict decrease of the objective function of the unconstrained optimization problem. It is shown that no damping of the step in the backtracking routine is needed close to a strict local minimizer, so the global method behaves as a local method. The local behavior for the unconstrained optimization problem is investigated on problems with two unknowns, and it is shown that there are no significant differences between second- and third-order methods in the region where the global method turns into a local method. Further, the final steps needed to reach a predefined tolerance are investigated. It is shown that the region where the higher order methods terminate in one or two iterations is significantly larger than the corresponding region for Newton's method.
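The contrast between the quadratic convergence of Newton's method and the cubic convergence of the Halley and Chebyshev methods can be seen already in the scalar case. The sketch below is a minimal illustration only, not the paper's multivariate setting: the test equation f(x) = x³ − 2, the starting point x₀ = 2, and the tolerance are assumptions chosen for clarity, and the stopping rule |f(x)| ≤ tol stands in for the paper's termination criterion.

```python
# Scalar illustration (an assumption for clarity; the paper treats systems
# of equations): count the iterations Newton's, Halley's, and Chebyshev's
# methods need to drive |f(x)| below a tolerance for f(x) = x^3 - 2.

def f(x):   return x**3 - 2.0
def fp(x):  return 3.0 * x**2   # f'
def fpp(x): return 6.0 * x      # f''

def newton_step(x):
    # Second-order (Q-order two) iteration.
    return x - f(x) / fp(x)

def halley_step(x):
    # Third-order iteration of Halley.
    return x - 2.0 * f(x) * fp(x) / (2.0 * fp(x)**2 - f(x) * fpp(x))

def chebyshev_step(x):
    # Third-order iteration of Chebyshev: Newton correction u plus a
    # curvature term involving f''.
    u = f(x) / fp(x)
    return x - u - 0.5 * (fpp(x) / fp(x)) * u**2

def solve(step, x0, tol=1e-12, max_iter=50):
    """Apply x <- step(x) until |f(x)| <= tol; return (root, iterations)."""
    x = x0
    for k in range(1, max_iter + 1):
        x = step(x)
        if abs(f(x)) <= tol:
            return x, k
    return x, max_iter

if __name__ == "__main__":
    for name, step in [("Newton", newton_step),
                       ("Halley", halley_step),
                       ("Chebyshev", chebyshev_step)]:
        root, iters = solve(step, x0=2.0)
        print(f"{name:9s}: root = {root:.12f} in {iters} iterations")
```

From the same starting point, the third-order methods reach the tolerance in fewer iterations than Newton's method, mirroring the paper's observation that the region of one- or two-iteration termination is larger for higher order methods.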

Fig. 1, Fig. 2 (figure content not available in this preview)
Notes

  1. It is convenient in this section to let subscripts denote components or elements and parenthesized superscripts denote the iteration count.


Author information

Corresponding author

Correspondence to Sara Suleiman.


About this article


Cite this article

Steihaug, T., Suleiman, S. On the final steps of Newton and higher order methods. Optim Lett 10, 401–416 (2016). https://doi.org/10.1007/s11590-015-0899-y
