
A performance analysis of Basin hopping compared to established metaheuristics for global optimization

Journal of Global Optimization

Abstract

In recent decades, many metaheuristics for global numerical optimization have been proposed. Among them, Basin Hopping is very simple and straightforward to implement, although it is rarely used outside its original Physical Chemistry community. In this work, our aim is to compare Basin Hopping, and two population variants of it, with readily available implementations of the well-known metaheuristics Differential Evolution, Particle Swarm Optimization, and Covariance Matrix Adaptation Evolution Strategy. We perform numerical experiments using the IOH profiler environment with the BBOB test function set and two difficult real-world problems. The experiments were carried out in two different but complementary ways: by measuring performance under a fixed budget of function evaluations and by considering a fixed target value. The general conclusion is that Basin Hopping and its newly introduced population variant are almost as good as Covariance Matrix Adaptation on the synthetic benchmark functions and better than it on the two hard cluster energy minimization problems. Thus, the proposed analyses show that Basin Hopping can be considered a good candidate for global numerical optimization problems alongside the more established metaheuristics, especially if one wants to obtain quick and reliable results on an unknown problem.
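For concreteness, the sketch below shows the plain (non-population) Basin Hopping loop via SciPy's off-the-shelf basinhopping routine (SciPy is cited as [30]); the Rastrigin test function, the parameter values, and the choice of L-BFGS-B as the local minimizer are illustrative assumptions, not the experimental setup reported in the paper.

```python
import numpy as np
from scipy.optimize import basinhopping

# Rastrigin: a standard multimodal test function, used here only as a
# stand-in for the BBOB functions discussed in the paper.
def rastrigin(x):
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

dim = 10
rng = np.random.default_rng(42)
x0 = rng.uniform(-5.0, 5.0, size=dim)      # random start in [-5, +5]^D

result = basinhopping(
    rastrigin,
    x0,
    niter=100,                              # basin-hopping iterations (illustrative)
    stepsize=0.5,                           # perturbation magnitude, cf. Note 1
    minimizer_kwargs={
        "method": "L-BFGS-B",               # gradient-based local search, cf. [24]
        "bounds": [(-5.0, 5.0)] * dim,
    },
)
print(result.x, result.fun)                 # best point found and its value
```

The same perturb, locally minimize, and accept/reject loop underlies the Basin Hopping variants studied in the paper.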


Data availability

The datasets generated and analysed during the current study are available from the corresponding author on reasonable request.

Notes

  1. As described in Sect. 3.2, the considered search space domain is \([-5,+5]^D\); therefore, the perturbation domain of BH, BHPOP and PBH is \([-0.5,+0.5]^D\) (a short illustrative sketch follows these notes).

  2. Critical difference diagrams have been introduced in [37] and then adapted to generic pairwise comparison tests in [38].
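
As a small illustration of the scaling in Note 1 (assuming, purely for this example, that the perturbation is drawn uniformly per coordinate; the note itself only fixes the range):

```python
import numpy as np

# Note 1: search space [-5, +5]^D, perturbation domain [-0.5, +0.5]^D,
# i.e. the search range scaled by a factor of 0.1.
lower, upper = -5.0, 5.0
scale = 0.1
step = scale * (upper - lower) / 2.0   # = 0.5

rng = np.random.default_rng(0)

def perturb(x):
    # Uniform perturbation in [-step, +step]^D (uniformity is an assumption
    # made only for this illustration).
    return x + rng.uniform(-step, step, size=x.shape)
```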

References

  1. Audet, C., Hare, W.: Derivative-free and Blackbox Optimization. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-68913-5

  2. Bäck, T., Foussette, C., Krause, P.: Contemporary Evolution Strategies. Springer, Berlin, Heidelberg (2013)

  3. Storn, R., Price, K.: Differential evolution-a simple and efficient heuristic for global optimization over continuous spaces. J. Global Optim. 11(4), 341–359 (1997). https://doi.org/10.1023/A:1008202821328

  4. Kirkpatrick, S., Gelatt, C.D., Vecchi, M.P.: Optimization by simulated annealing. Science 220(4598), 671–680 (1983). https://doi.org/10.1126/science.220.4598.671

  5. Kennedy, J., Eberhart, R.: Particle swarm optimization. In: Proceedings of ICNN’95-International Conference on Neural Networks, vol. 4, pp. 1942–1948 (1995). https://doi.org/10.1109/ICNN.1995.488968 . IEEE

  6. Liberti, L.: Introduction to global optimization. Ecole Polytechnique, France (2008)

  7. Locatelli, M., Schoen, F.: Global Optimization: Theory, Algorithms, and Applications. Society for Industrial and Applied Mathematics, Philadelphia, USA (2013)

  8. Kochenderfer, M.J., Wheeler, T.A.: Algorithms for Optimization. MIT Press, Cambridge, Massachusetts (2019)

  9. Sergeyev, Y.D., Kvasov, D.E., Mukhametzhanov, M.S.: On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 8(1), 1–9 (2018). https://doi.org/10.1038/s41598-017-18940-4

  10. Pham, D.T., Castellani, M.: Benchmarking and comparison of nature-inspired population-based continuous optimisation algorithms. Soft. Comput. 18(5), 871–903 (2014). https://doi.org/10.1007/s00500-013-1104-9

  11. Wales, D.J., Doye, J.P.: Global optimization by basin-hopping and the lowest energy structures of Lennard-Jones clusters containing up to 110 atoms. J. Phys. Chem. A 101(28), 5111–5116 (1997). https://doi.org/10.1021/jp970984n

  12. Wales, D.J., Scheraga, H.A.: Global optimization of clusters, crystals, and biomolecules. Science 285(5432), 1368–1372 (1999). https://doi.org/10.1126/science.285.5432.1368

  13. Doye, J.P., Leary, R.H., Locatelli, M., Schoen, F.: Global optimization of Morse clusters by potential energy transformations. INFORMS J. Comput. 16(4), 371–379 (2004). https://doi.org/10.1287/ijoc.1040.0084

  14. Kucharik, M., Hofacker, I.L., Stadler, P.F., Qin, J.: Basin hopping graph: a computational framework to characterize RNA folding landscapes. Bioinformatics 30(14), 2009–2017 (2014). https://doi.org/10.1093/bioinformatics/btu156

  15. Zhou, C., Ieritano, C., Hopkins, W.S.: Augmenting basin-hopping with techniques from unsupervised machine learning: Applications in spectroscopy and ion mobility. Front. Chem. 7, 519 (2019). https://doi.org/10.3389/fchem.2019.00519

  16. Banerjee, A., Jasrasaria, D., Niblett, S.P., Wales, D.J.: Crystal structure prediction for benzene using basin-hopping global optimization. J. Phys. Chem. A 125(17), 3776–3784 (2021). https://doi.org/10.1021/acs.jpca.1c00903

  17. Baioletti, M., Milani, A., Santucci, V., Tomassini, M.: Comparing basin hopping with differential evolution and particle swarm optimization. In: International Conference on the Applications of Evolutionary Computation (Part of EvoStar), pp. 46–60 (2022). https://doi.org/10.1007/978-3-031-02462-7_4 . Springer

  18. Hansen, N., Finck, S., Ros, R., Auger, A.: Real-parameter black-box optimization benchmarking 2009: Noiseless functions definitions. Research report, INRIA (2009)

  19. Doerr, C., Wang, H., Ye, F., Rijn, S., Bäck, T.: IOHprofiler: A benchmarking and profiling tool for iterative optimization heuristics. arXiv preprint arXiv:1810.05281 (2018) https://doi.org/10.48550/arXiv.1810.05281

  20. Baudis, P.: COCOpf: An algorithm portfolio framework. arXiv preprint arXiv:1405.3487 (2014) https://doi.org/10.48550/arXiv.1405.3487

  21. Grosso, A., Locatelli, M., Schoen, F.: An experimental analysis of a population based approach for global optimization. Comput. Optim. Appl. 38(3), 351–370 (2007). https://doi.org/10.1007/s10589-007-9026-z

  22. Grosso, A., Locatelli, M., Schoen, F.: A population-based approach for hard global optimization problems based on dissimilarity measures. Math. Program. 110(2), 373–404 (2007). https://doi.org/10.1007/s10107-006-0006-3

  23. Leary, R.H.: Global optimization on funneling landscapes. J. Global Optim. 18(4), 367–383 (2000). https://doi.org/10.1023/A:1026500301312

  24. Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45(1–3), 503–528 (1989). https://doi.org/10.1007/BF01589116

  25. Nelder, J.A., Mead, R.: A simplex method for function minimization. Comput. J. 7(4), 308–313 (1965). https://doi.org/10.1093/comjnl/7.4.308

  26. Powell, M.J.: An efficient method for finding the minimum of a function of several variables without calculating derivatives. Comput. J. 7(2), 155–162 (1964). https://doi.org/10.1093/comjnl/7.2.155

  27. Lourenço, H.R., Martin, O.C., Stützle, T.: Iterated local search: Framework and applications. In: Handbook of Metaheuristics, pp. 129–168. Springer, Boston, MA (2019). https://doi.org/10.1007/978-3-319-91086-4_5

  28. Hansen, N., Ostermeier, A.: Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation. In: Proceedings of IEEE International Conference on Evolutionary Computation, pp. 312–317 (1996). https://doi.org/10.1109/ICEC.1996.542381 . IEEE

  29. Rapin, J., Teytaud, O.: Nevergrad - A gradient-free optimization platform. GitHub, California (2018)

  30. Virtanen, P., Gommers, R., Oliphant, T.E., Haberland, M., Reddy, T., Cournapeau, D., Burovski, E., Peterson, P., Weckesser, W., Bright, J., van der Walt, S.J., Brett, M., Wilson, J., Millman, K.J., Mayorov, N., Nelson, A.R.J., Jones, E., Kern, R., Larson, E., Carey, C.J., Polat, İ., Feng, Y., Moore, E.W., VanderPlas, J., Laxalde, D., Perktold, J., Cimrman, R., Henriksen, I., Quintero, E.A., Harris, C.R., Archibald, A.M., Ribeiro, A.H., Pedregosa, F., van Mulbregt, P., SciPy 1.0 Contributors: SciPy 1.0: fundamental algorithms for scientific computing in Python. Nat. Methods 17, 261–272 (2020). https://doi.org/10.1038/s41592-019-0686-2

  31. Faure, H., Pillichshammer, F., Pirsic, G., Schmid, W.C.: L2 discrepancy of generalized two-dimensional Hammersley point sets scrambled with arbitrary permutations. Acta Arith. 141(4), 395–418 (2010). https://doi.org/10.4064/aa141-4-6

  32. Wolpert, D.H., Macready, W.G.: No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1(1), 67–82 (1997). https://doi.org/10.1109/4235.585893

  33. Alabert, A., Berti, A., Caballero, R., Ferrante, M.: No-free-lunch theorems in the continuum. Theoret. Comput. Sci. 600, 98–106 (2015). https://doi.org/10.1016/j.tcs.2015.07.029

  34. Derrac, J., García, S., Molina, D., Herrera, F.: A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evolut. Comput. 1(1), 3–18 (2011). https://doi.org/10.1016/j.swevo.2011.02.002

  35. Hollander, M., Wolfe, D.A., Chicken, E.: Nonparametric Statistical Methods vol. 751. John Wiley & Sons, Ltd, Hoboken, New Jersey, USA (2013). https://doi.org/10.1002/9781119196037

  36. Wang, H., Vermetten, D., Ye, F., Doerr, C., Bäck, T.: IOHanalyzer: detailed performance analyses for iterative optimization heuristics. ACM Trans. Evolut. Learn. Optim. 2(1), 1–29 (2022). https://doi.org/10.1145/3510426

  37. Demšar, J.: Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res. 7, 1–30 (2006). https://doi.org/10.5555/1248547.1248548

  38. Ismail Fawaz, H., Forestier, G., Weber, J., Idoumghar, L., Muller, P.-A.: Deep learning for time series classification: a review. Data Min. Knowl. Disc. 33(4), 917–963 (2019). https://doi.org/10.1007/s10618-019-00619-1

  39. Das, S., Suganthan, P.N.: Problem definitions and evaluation criteria for CEC 2011 competition on testing evolutionary algorithms on real world optimization problems. Technical report, Jadavpur University, Kolkata, India and Nanyang Technological University, Singapore (2010)

Author information

Corresponding author

Correspondence to Marco Baioletti.

Ethics declarations

Conflict of interest

The authors have no financial or proprietary interests in any material discussed in this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Baioletti, M., Santucci, V. & Tomassini, M. A performance analysis of Basin hopping compared to established metaheuristics for global optimization. J Glob Optim 89, 803–832 (2024). https://doi.org/10.1007/s10898-024-01373-5
