DOI: 10.1145/3450218.3477307
Research article · Open access

Non-local optimization: imposing structure on optimization problems by relaxation

Published: 06 September 2021

Abstract

In stochastic optimization, particularly in evolutionary computation and reinforcement learning, the optimization of a function f : Ω → R is often addressed through optimizing a so-called relaxation θ ∈ Θ ↦ E_θ(f) of f, where Θ parameterizes a family of probability measures on Ω. We investigate the structure of such relaxations by means of measure theory and Fourier analysis, enabling us to shed light on the success of many associated stochastic optimization methods. The main structural traits we derive, which allow fast and reliable optimization of relaxations, are consistency of the optimal values of f, Lipschitz continuity of gradients, and convexity. We emphasize settings where f itself is neither differentiable nor convex, e.g., in the presence of (stochastic) disturbance.
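To make the relaxation concrete, the following minimal Python sketch (not from the paper; the objective f, the step size, and the sample budget are illustrative assumptions) minimizes a Gaussian relaxation F(μ) = E_{x ~ N(μ, σ²I)}[f(x)] of a non-differentiable, noisy f via the score-function (log-likelihood) gradient estimator that underlies evolution strategies. Although f itself has no useful gradient, F is smooth and can be descended stochastically.

    import numpy as np

    def f(x, rng):
        # Hypothetical test objective: non-differentiable, noisy "sphere".
        return np.sum(np.abs(x)) + 0.1 * rng.standard_normal()

    def step(mu, sigma, rng, n=64, lr=0.05):
        # One descent step on the Gaussian relaxation
        #   F(mu) = E_{x ~ N(mu, sigma^2 I)}[f(x)],
        # using the score-function identity
        #   grad F(mu) = E[ f(x) * (x - mu) / sigma^2 ].
        eps = rng.standard_normal((n, mu.size))       # x = mu + sigma * eps
        vals = np.array([f(mu + sigma * e, rng) for e in eps])
        vals = vals - vals.mean()                     # baseline for variance reduction
        grad = eps.T @ vals / (n * sigma)             # Monte Carlo gradient estimate
        return mu - lr * grad

    rng = np.random.default_rng(0)
    mu = rng.standard_normal(10)
    for _ in range(300):
        mu = step(mu, sigma=0.3, rng=rng)
    print(np.linalg.norm(mu, 1))                      # shrinks toward the minimizer at 0

This illustrates the abstract's structural point: the relaxation F inherits neither the kinks of f nor, at the level of its expectation, the stochastic disturbance, so plain stochastic gradient descent on θ = μ succeeds. A full evolution strategy would also adapt σ; it is held fixed here for brevity.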


Cited By

  • (2023) On a Population Sizing Model for Evolution Strategies Optimizing the Highly Multimodal Rastrigin Function. Proceedings of the Genetic and Evolutionary Computation Conference, 848-855. DOI: 10.1145/3583131.3590451. Online publication date: 15-Jul-2023.



Published In

FOGA '21: Proceedings of the 16th ACM/SIGEVO Conference on Foundations of Genetic Algorithms
September 2021
130 pages
ISBN:9781450383523
DOI:10.1145/3450218

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. evolution strategies
  2. global optimization
  3. robust optimization
  4. stochastic optimization


Conference

FOGA '21: Foundations of Genetic Algorithms XVI
September 6-8, 2021
Virtual Event, Austria

Acceptance Rates

FOGA '21 paper acceptance rate: 10 of 21 submissions, 48%
Overall acceptance rate: 72 of 131 submissions, 55%


