Tuning Optimization Algorithms Under Multiple Objective Function Evaluation Budgets

Published: 01 June 2015

Abstract

Most sensitivity analysis studies of optimization algorithm control parameters are restricted to a single objective function evaluation (OFE) budget. This restriction is problematic because the optimality of control parameter values (CPVs) depends not only on the problem's fitness landscape, but also on the OFE budget available to explore that landscape. The OFE budget therefore needs to be taken into consideration when performing control parameter tuning. This paper presents tuning multiobjective particle swarm optimization (tMOPSO), a new algorithm for tuning the CPVs of stochastic optimization algorithms under a range of OFE budget constraints. Specifically, for a given problem tMOPSO aims to determine multiple groups of CPVs, each of which results in optimal performance at a different OFE budget. To achieve this, the control parameter tuning problem is formulated as a multiobjective optimization problem. Additionally, tMOPSO uses a noise-handling strategy and a CPV assessment procedure that are specialized for tuning stochastic optimization algorithms. Numerical experiments provide evidence that tMOPSO is effective at tuning under multiple OFE budget constraints.
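The core observation above — that the best CPVs change with the OFE budget — can be illustrated with a minimal sketch. This is not the paper's tMOPSO algorithm; it is a toy brute-force assessment, under assumed names (`toy_optimizer`, `assess_cpv`) and a sphere test function, showing that one run of a stochastic optimizer can report its incumbent best at every budget up to the maximum, so a single set of runs assesses a CPV across all budgets at once:

```python
import random

def toy_optimizer(step_size, budget, seed):
    """Toy (1+1)-style stochastic search on the sphere function f(x) = x^2.

    Returns the incumbent best f value after each objective function
    evaluation (OFE), so one run yields performance at every budget
    up to `budget`.
    """
    rng = random.Random(seed)
    x = rng.uniform(-5.0, 5.0)
    best = x * x
    history = []
    for _ in range(budget):
        cand = x + rng.gauss(0.0, step_size)   # step_size is the CPV being tuned
        f = cand * cand
        if f < best:                            # greedy acceptance
            best, x = f, cand
        history.append(best)                    # incumbent best after this OFE
    return history

def assess_cpv(step_size, budgets, runs=20):
    """Mean best error at each OFE budget, averaged over independent runs."""
    max_b = max(budgets)
    hists = [toy_optimizer(step_size, max_b, seed) for seed in range(runs)]
    return {b: sum(h[b - 1] for h in hists) / runs for b in budgets}

budgets = [10, 50, 200]
candidates = [0.05, 0.3, 1.5]                   # candidate CPVs (step sizes)
results = {s: assess_cpv(s, budgets) for s in candidates}
for b in budgets:
    best_s = min(candidates, key=lambda s: results[s][b])
    print(f"budget {b:4d}: best step_size = {best_s}")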


Cited By

  • (2021) "Few-Shots Parallel Algorithm Portfolio Construction via Co-Evolution," IEEE Transactions on Evolutionary Computation, vol. 25, no. 3, pp. 595–607, Jun. 2021. DOI: 10.1109/TEVC.2021.3059661
  • (2021) "Automated parameter tuning as a bilevel optimization problem solved by a surrogate-assisted population-based approach," Applied Intelligence, vol. 51, no. 8, pp. 5978–6000, Aug. 2021. DOI: 10.1007/s10489-020-02151-y


Published In

IEEE Transactions on Evolutionary Computation, Volume 19, Issue 3, June 2015, 152 pages

          Publisher

          IEEE Press


Author Tags

1. objective function evaluation (OFE) budget
2. control parameter tuning
3. multiobjective optimization

          Qualifiers

          • Research-article


