
Global convergence and the Powell singular function

Published in: Journal of Global Optimization

Abstract

The Powell singular function was introduced in 1962 by M.J.D. Powell as an unconstrained optimization problem. The function is also used as a nonlinear least squares problem and as a system of nonlinear equations. It is a classic test function included in collections of test problems in optimization, as well as an example problem in textbooks, and in the global optimization literature it is cited as a difficult test case. The function is convex, and the Hessian has a double singularity at the solution. In this paper we consider Newton's method and methods in the Halley class, and we discuss the relationship between these methods on the Powell singular function. We show that these methods converge globally, but only at a linear rate. The function belongs to a subclass of unary functions, and the results for Newton's method and the methods in the Halley class extend to this class. Newton's method is often made globally convergent by introducing a line search. We show that a full Newton step satisfies many of the standard step length rules, and that exact line searches yield a slightly faster linear rate of convergence than Newton's method. We illustrate some of these properties with numerical experiments.
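As an illustration of the behavior the abstract describes (this sketch is not the authors' code), the following Python snippet runs Newton's method with full steps on the Powell singular function f(x) = (x1 + 10·x2)² + 5(x3 − x4)² + (x2 − 2·x3)⁴ + 10(x1 − x4)⁴ from the standard starting point (3, −1, 0, 1). Because the Hessian is doubly singular at the solution x* = 0, the error ratio ||x_{k+1}||/||x_k|| settles near a constant rather than tending to zero, consistent with the global but linear rate of convergence discussed above.

```python
# Newton's method with full steps on the Powell singular function.
# Minimal sketch; the linear-rate behavior at the singular minimizer
# is visible in the ratio of successive iterate norms.
import numpy as np

def f(x):
    x1, x2, x3, x4 = x
    return (x1 + 10*x2)**2 + 5*(x3 - x4)**2 + (x2 - 2*x3)**4 + 10*(x1 - x4)**4

def grad(x):
    x1, x2, x3, x4 = x
    return np.array([
        2*(x1 + 10*x2) + 40*(x1 - x4)**3,
        20*(x1 + 10*x2) + 4*(x2 - 2*x3)**3,
        10*(x3 - x4) - 8*(x2 - 2*x3)**3,
        -10*(x3 - x4) - 40*(x1 - x4)**3,
    ])

def hess(x):
    x1, x2, x3, x4 = x
    a = 120*(x1 - x4)**2        # curvature from the quartic 10(x1-x4)^4
    b = 12*(x2 - 2*x3)**2       # curvature from the quartic (x2-2x3)^4
    return np.array([
        [2 + a,  20,      0,        -a     ],
        [20,     200 + b, -2*b,      0     ],
        [0,      -2*b,    10 + 4*b, -10    ],
        [-a,     0,       -10,      10 + a ],
    ])

x = np.array([3.0, -1.0, 0.0, 1.0])   # standard starting point
norms = [np.linalg.norm(x)]
for _ in range(20):
    x = x + np.linalg.solve(hess(x), -grad(x))   # full Newton step
    norms.append(np.linalg.norm(x))

ratios = [norms[k + 1] / norms[k] for k in range(len(norms) - 1)]
print("f(x) =", f(x), " final ratio =", ratios[-1])
```

The final ratio stays bounded away from zero (near the asymptotic linear factor), whereas a quadratically convergent iteration would drive it rapidly toward zero.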


References

  1. Addis, B., Locatelli, M.: A new class of test functions for global optimization. J. Glob. Optim. 38(3), 479–501 (2007)

  2. Amat, S., Busquier, S., Gutiérrez, J.M., Hernández, M.A.: On the global convergence of Chebyshev's iterative method. J. Comput. Appl. Math. 220(1–2), 17–21 (2008)

  3. Bongartz, I., Conn, A.R., Gould, N., Toint, P.L.: CUTE: constrained and unconstrained testing environment. ACM Trans. Math. Softw. 21(1), 123–160 (1995)

  4. Brent, R.P.: Algorithms for Minimization Without Derivatives. Prentice-Hall, Englewood Cliffs, NJ (1973)

  5. Brent, R.P.: Some efficient algorithms for solving systems of nonlinear equations. SIAM J. Numer. Anal. 10(2), 327–344 (1973)

  6. Conn, A.R., Gould, N.I.M., Toint, P.L.: Testing a class of methods for solving minimization problems with simple bounds on the variables. Math. Comput. 50(182), 399–430 (1988)

  7. Csendes, T., Ratz, D.: Subdivision direction selection in interval methods for global optimization. SIAM J. Numer. Anal. 34(3), 922–938 (1997)

  8. Cuyt, A., Cruyssen, P.: Abstract Padé-approximants for the solution of a system of nonlinear equations. Comput. Math. Appl. 9(4), 617–624 (1983)

  9. Dennis, J.E., Schnabel, R.B.: Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Prentice-Hall, Englewood Cliffs, NJ (1983)

  10. Dixon, L.C.W.: Nonlinear optimisation: a survey of the state of the art. In: Evans, D.J. (ed.) Software for Numerical Mathematics: Proceedings of the Loughborough University of Technology Conference of the Institute of Mathematics and Its Applications Held in April 1973, pp. 193–216. Academic Press, Orlando, USA (1974)

  11. Dixon, L.C.W., Biggs, M.C.: Meander–a Newton based procedure for n-dimensional function minimization. Tech. Rep. 9, The Hatfield Polytechnic, Numerical Optimization Center (1970)

  12. Du, D.-Z., Pardalos, P.M., Wu, W.: Mathematical Theory of Optimization. Kluwer, Boston, MA (2001)

  13. Du, D.-Z., Pardalos, P.M., Wu, W.: Rosen's method, global convergence, and Powell's conjecture. In: Floudas, C.A., Pardalos, P.M. (eds.) Encyclopedia of Optimization, pp. 3345–3354. Springer, Berlin (2009)

  14. El-Bakry, A., Steihaug, T.: On the component-wise convergence rate. Tech. Rep. TR 98-16, Department of Computational and Applied Mathematics, Rice University (2000)

  15. Fletcher, R.: Practical Methods of Optimization. Wiley, New York, NY (1987)

  16. Fletcher, R., Freeman, T.L.: A modified Newton method for minimization. J. Optim. Theory Appl. 23(3), 357–372 (1977)

  17. Floudas, C.A., Pardalos, P.M., Adjiman, C., Esposito, W.R., Gümüs, Z.H., Harding, S.T., Klepeis, J.L., Meyer, C.A., Schweiger, C.A.: Handbook of Test Problems in Local and Global Optimization. Kluwer, Dordrecht (1999)

  18. Goldfarb, D., Wang, S.: Partial-update Newton methods for unary, factorable, and partially separable optimization. SIAM J. Optim. 3(2), 382–397 (1993)

  19. Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton's method. SIAM J. Numer. Anal. 23(4), 707–716 (1986)

  20. Gundersen, G., Steihaug, T.: On large-scale unconstrained optimization problems and higher order methods. Optim. Methods Softw. 25(3), 337–358 (2010)

  21. Hirsch, M.J., Meneses, C.N., Pardalos, P.M., Resende, M.G.C.: Global optimization by continuous GRASP. Optim. Lett. 1(2), 201–212 (2007)

  22. Hirsch, M.J., Pardalos, P.M., Resende, M.G.C.: Solving systems of nonlinear equations with continuous GRASP. Nonlinear Anal. Real World Appl. 10(4), 2000–2006 (2009)

  23. Hu, J., Fu, M.C., Marcus, S.I.: A model reference adaptive search method for global optimization. Oper. Res. 55(3), 549–568 (2007)

  24. Kearfott, R.B.: Some tests of generalized bisection. ACM Trans. Math. Softw. 13(3), 197–220 (1987)

  25. Lukšan, L.: Computational experience with improved variable metric methods for unconstrained minimization. Kybernetika 26(5), 415–431 (1990)

  26. McCormick, G.P., Sofer, A.: Optimization with unary functions. Math. Program. 52(1), 167–178 (1991)

  27. Melman, A.: Geometry and convergence of Euler's and Halley's methods. SIAM Rev. 39(4), 728–735 (1997)

  28. Moré, J.J., Cosnard, M.Y.: Numerical solution of nonlinear equations. ACM Trans. Math. Softw. 5(1), 64–85 (1979)

  29. Moré, J.J., Garbow, B.S., Hillstrom, K.E.: Algorithm 566: Fortran subroutines for testing unconstrained optimization software. ACM Trans. Math. Softw. 7(1), 136–140 (1981)

  30. Moré, J.J., Garbow, B.S., Hillstrom, K.E.: Testing unconstrained optimization software. ACM Trans. Math. Softw. 7(1), 17–41 (1981)

  31. Morgan, A.P.: A method for computing all solutions to systems of polynomial equations. ACM Trans. Math. Softw. 9(1), 1–17 (1983)

  32. Nelder, J.A., Mead, R.: A simplex method for function minimization. Comput. J. 7, 308–313 (1965)

  33. Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer, Berlin (2006)

  34. O'Neill, R.: Algorithm AS 47: function minimization using a simplex procedure. J. R. Stat. Soc. Ser. C (Appl. Stat.) 20(3), 338–345 (1971)

  35. Ostrowski, A.M.: Solution of Equations in Euclidean and Banach Spaces. Third edition of Solution of Equations and Systems of Equations. Pure and Applied Mathematics, vol. 9. Academic Press, New York, NY (1973)

  36. Özdamar, L., Demirhan, M.: Experiments with new stochastic global optimization search techniques. Comput. Oper. Res. 27(9), 841–865 (2000)

  37. Powell, M.J.D.: An iterative method for finding stationary values of a function of several variables. Comput. J. 5, 147–151 (1962)

  38. Rao, S.S.: Engineering Optimization: Theory and Practice. Wiley, New York, NY (1996)

  39. Roose, A., Kulla, V., Lomp, M., Meressoo, T.: Test Examples of Systems of Nonlinear Equations. Estonian Software and Computer Service Company, Tallinn (1990)

  40. Rosenbrock, H.H., Storey, C.: Computational Techniques for Chemical Engineers. Pergamon Press, New York, NY (1966)

  41. Schittkowski, K.: More Test Examples for Nonlinear Programming Codes. Lecture Notes in Economics and Mathematical Systems, vol. 282. Springer, New York, NY (1987)

  42. Schwefel, H.-P.: Numerical Optimization of Computer Models. Wiley, New York, NY (1981)

  43. Spedicato, E.: Computational experience with quasi-Newton algorithms for minimization problems of moderately large size. In: Dixon, L.C.W., Szegö, G.P. (eds.) Towards Global Optimisation 2, pp. 209–219. North-Holland, New York, USA (1978)

  44. Steihaug, T., Suleiman, S.: Notes on invariance of higher order methods under linear transformation. Tech. Rep. (under preparation), Department of Informatics, University of Bergen (2012)

  45. Stewart, G.W.: A modification of Davidon's minimization method to accept difference approximations of derivatives. J. Assoc. Comput. Mach. 14(1), 72–83 (1967)

  46. Sun, W., Yuan, Y.: Optimization Theory and Methods: Nonlinear Programming. Springer Optimization and Its Applications. Springer, Berlin (2006)

  47. Thorlund-Petersen, L.: Global convergence of Newton's method on an interval. Math. Methods Oper. Res. 59, 91–110 (2004)

  48. Walster, G.W., Hansen, E.R., Sengupta, S.: Test results for a global optimization algorithm. In: Proceedings of the SIAM Conference on Numerical Optimization, Boulder, Colorado, June 12–14, 1984, pp. 272–287. SIAM (1985)

Author information

Correspondence to Sara Suleiman.

Cite this article

Steihaug, T., Suleiman, S. Global convergence and the Powell singular function. J Glob Optim 56, 845–853 (2013). https://doi.org/10.1007/s10898-012-9898-z
