DOI: 10.1145/3512290.3528875

On optimal static and dynamic parameter choices for fixed-target optimization

Published: 08 July 2022

Abstract

Various flavours of parameter setting, such as (static) parameter tuning and (dynamic) parameter control, receive a lot of attention both in applications and in theoretical investigations. It is widely acknowledged that the choice of parameter values strongly influences the efficiency of evolutionary algorithms (and other search heuristics), and a considerable amount of work has been dedicated to finding (near-)optimal choices of parameters, or of parameter control strategies.
It is perhaps surprising that all the recent theoretical attempts aim at minimizing the time needed to reach the optimum, whereas in most practical settings we may only hope to reach certain realistic fitness targets. One way to close this gap is to study static and dynamic parameter choices in fixed-target settings, and to understand how these choices differ from those tuned towards reaching the optimum. In this paper we investigate some of these settings, using a mixture of exact theory-driven computations and experimental evaluation, and find a few remarkably generic trends, some of which may explain a number of misconceptions found in evolutionary computation.
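To make the fixed-target setting concrete, here is a minimal sketch (not the paper's algorithms or experimental setup): a (1+1) EA on OneMax that stops as soon as a given fitness target is reached, rather than only at the optimum n. The static mutation rate 1/n and all function names here are illustrative assumptions.

```python
import random


def one_max(x):
    """OneMax fitness: the number of one-bits in the bit string."""
    return sum(x)


def one_plus_one_ea_fixed_target(n, target, mutation_rate=None, seed=0,
                                 max_evals=10**6):
    """Run a (1+1) EA on OneMax until fitness >= target (which need not be
    the optimum n); return the number of fitness evaluations used."""
    rng = random.Random(seed)
    p = mutation_rate if mutation_rate is not None else 1.0 / n  # standard static choice
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = one_max(x)
    evals = 1
    while fx < target and evals < max_evals:
        # Standard bit mutation: flip each bit independently with probability p.
        y = [1 - b if rng.random() < p else b for b in x]
        fy = one_max(y)
        evals += 1
        if fy >= fx:  # elitist acceptance
            x, fx = y, fy
    return evals


# A fixed target such as 0.9 * n is typically reached much faster than the optimum n.
print(one_plus_one_ea_fixed_target(100, 90, seed=1))
print(one_plus_one_ea_fixed_target(100, 100, seed=1))
```

With the same random seed, the run to target 0.9n is a prefix of the run to the optimum, so the fixed-target cost is never larger; the abstract's point is that the parameter values (here the mutation rate) minimizing one cost need not minimize the other.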


Published In

GECCO '22: Proceedings of the Genetic and Evolutionary Computation Conference
July 2022
1472 pages
ISBN:9781450392372
DOI:10.1145/3512290

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. fixed-target analysis
  2. parameter control
  3. parameter tuning

Qualifiers

  • Research-article

Conference

GECCO '22
Acceptance Rates

Overall Acceptance Rate 1,669 of 4,410 submissions, 38%
