DOI: 10.1145/1389095.1389188

Functionally specialized CMA-ES: a modification of CMA-ES based on the specialization of the functions of covariance matrix adaptation and step size adaptation

Published: 12 July 2008

Abstract

This paper addresses the design of efficient and effective algorithms for function optimization and presents a new framework for the derandomized evolution strategy with covariance matrix adaptation (CMA-ES). Recent studies have modified the CMA-ES from the viewpoint of covariance matrix adaptation, drastically reducing the number of generations required. Complementing those modifications, this paper modifies the CMA-ES from the viewpoint of step size adaptation. The main idea is to semantically specialize the functions of covariance matrix adaptation and step size adaptation. The new method is evaluated on eight classical unimodal and multimodal test functions, and its performance is compared with that of the standard CMA-ES. The experimental results demonstrate improved search performance, particularly with large populations, mainly because the proposed Hybrid-SSA, used in place of the existing CSA, adjusts the global step length more appropriately under large populations, and because the functional specialization helps the overall variance of the mutation distribution adapt appropriately.
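
The abstract contrasts the standard cumulative step-size adaptation (CSA) with the paper's proposed Hybrid-SSA. As background, the sketch below implements a minimal (mu/mu_w, lambda)-CMA-ES with the standard CSA rule that the paper modifies; strategy constants follow Hansen's commonly published defaults, the stall indicator h_sigma is omitted for brevity, and all function and variable names are illustrative rather than taken from the paper.

```python
import numpy as np

def cma_es(f, x0, sigma0, max_gen=300, seed=0):
    """Minimal (mu/mu_w, lambda)-CMA-ES sketch with standard CSA step-size control."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    lam = 4 + int(3 * np.log(n))                  # default population size
    mu = lam // 2
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                                  # positive recombination weights
    mu_eff = 1.0 / np.sum(w ** 2)                 # variance-effective selection mass

    # strategy parameters (Hansen's default settings)
    c_sigma = (mu_eff + 2) / (n + mu_eff + 5)
    d_sigma = 1 + 2 * max(0, np.sqrt((mu_eff - 1) / (n + 1)) - 1) + c_sigma
    c_c = (4 + mu_eff / n) / (n + 4 + 2 * mu_eff / n)
    c_1 = 2 / ((n + 1.3) ** 2 + mu_eff)
    c_mu = min(1 - c_1, 2 * (mu_eff - 2 + 1 / mu_eff) / ((n + 2) ** 2 + mu_eff))
    chi_n = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n ** 2))  # E||N(0,I)||

    m, sigma = np.array(x0, dtype=float), float(sigma0)
    C = np.eye(n)
    p_sigma = np.zeros(n)                         # step-size evolution path
    p_c = np.zeros(n)                             # covariance evolution path

    for _ in range(max_gen):
        A = np.linalg.cholesky(C)                 # C = A A^T
        z = rng.standard_normal((lam, n))         # isotropic samples
        y = z @ A.T                               # y_i ~ N(0, C)
        x = m + sigma * y
        idx = np.argsort([f(xi) for xi in x])[:mu]
        y_w = w @ y[idx]                          # weighted recombination step
        z_w = w @ z[idx]                          # same step in isotropic coordinates
        m = m + sigma * y_w

        # CSA: cumulate the step and compare the path length against its
        # expectation under random selection (h_sigma omitted in this sketch)
        p_sigma = (1 - c_sigma) * p_sigma \
            + np.sqrt(c_sigma * (2 - c_sigma) * mu_eff) * z_w
        sigma *= np.exp((c_sigma / d_sigma) * (np.linalg.norm(p_sigma) / chi_n - 1))

        # rank-one + rank-mu covariance matrix update
        p_c = (1 - c_c) * p_c + np.sqrt(c_c * (2 - c_c) * mu_eff) * y_w
        C = ((1 - c_1 - c_mu) * C
             + c_1 * np.outer(p_c, p_c)
             + c_mu * sum(wi * np.outer(yi, yi) for wi, yi in zip(w, y[idx])))
        C = (C + C.T) / 2                         # keep C numerically symmetric
    return m
```

The paper's functional specialization would replace the CSA branch above with a step-size rule tuned for large populations (the Hybrid-SSA), while the covariance update retains its own role; this sketch only shows the baseline the paper starts from.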



Published In

GECCO '08: Proceedings of the 10th Annual Conference on Genetic and Evolutionary Computation
July 2008, 1814 pages
ISBN: 9781605581309
DOI: 10.1145/1389095
Conference Chair: Conor Ryan
Editor: Maarten Keijzer

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. covariance matrix adaptation
  2. evolution strategy
  3. functional specialization
  4. step size adaptation

Qualifiers

  • Research-article

Conference

GECCO '08

Acceptance Rates

Overall Acceptance Rate: 1,669 of 4,410 submissions, 38%

