
Linear equalities in blackbox optimization

Published in: Computational Optimization and Applications

Abstract

The mesh adaptive direct search (Mads) algorithm is designed for blackbox optimization problems subject to general inequality constraints. Mads currently does not support equality constraints, either in theory or in practice. The present work proposes extensions to treat problems with linear equalities whose expression is known. The main idea is to reformulate the optimization problem into an equivalent problem without equalities and possibly fewer optimization variables. Several such reformulations are proposed, involving orthogonal projections, QR or SVD decompositions, as well as simplex decompositions into basic and nonbasic variables. All of these strategies are studied within a unified convergence analysis, guaranteeing Clarke stationarity under mild conditions provided by a new result on the hypertangent cone. Numerical results on a subset of the CUTEst collection are reported.
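To illustrate the reformulation idea, the following is a minimal sketch (not code from the paper) of one of the strategies the abstract mentions: an SVD-based null-space parameterization. Given linear equalities Ax = b, the feasible set is written as x = x0 + Zy, where x0 is any particular solution and the columns of Z span the null space of A; minimizing g(y) = f(x0 + Zy) is then an equality-free problem in fewer variables. All names (A, b, f, g) are illustrative.

```python
import numpy as np

def null_space_reformulation(A, b):
    """Return (x0, Z): a particular solution of A x = b and a
    basis Z whose columns span the null space of A."""
    x0, *_ = np.linalg.lstsq(A, b, rcond=None)    # particular solution
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-12 * s[0]))          # numerical rank of A
    Z = Vt[rank:].T                               # columns span null(A)
    return x0, Z

# Toy example: minimize f(x) = ||x - c||^2 subject to x1 + x2 + x3 = 1.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0, 3.0])
x0, Z = null_space_reformulation(A, b)

def g(y):
    """Reduced, equality-free objective in the lower-dimensional y."""
    return np.sum((x0 + Z @ y - c) ** 2)

# Every choice of y yields a feasible x, so any unconstrained (or
# inequality-only) blackbox solver can be applied to g directly.
y = np.random.default_rng(0).standard_normal(Z.shape[1])
x = x0 + Z @ y
assert np.allclose(A @ x, b)
```

Here the original 3-variable problem becomes a 2-variable one; the paper studies several such constructions (orthogonal projections, QR, SVD, basic/nonbasic splits) under a common convergence analysis.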



Acknowledgments

The authors would like to thank two anonymous referees for their careful reading and helpful comments and suggestions. The work of the first author was supported by NSERC Grant 239436, and that of the second author by NSERC Grant 418250. The first and second authors were also supported by AFOSR Grant FA9550-12-1-0198.

Author information

Correspondence to Mathilde Peyrega.


About this article


Cite this article

Audet, C., Le Digabel, S. & Peyrega, M. Linear equalities in blackbox optimization. Comput Optim Appl 61, 1–23 (2015). https://doi.org/10.1007/s10589-014-9708-2

