Algorithm selection for black-box continuous optimization problems

Published: 01 October 2015

Abstract

Selecting the most appropriate algorithm to use when attempting to solve a black-box continuous optimization problem is a challenging task. Such problems typically lack an algebraic expression, derivative information cannot be calculated, and the problem may exhibit uncertainty or noise. In many cases, the input and output variables are analyzed without considering the internal details of the problem. Algorithm selection requires expert knowledge of search algorithm efficacy as well as skills in algorithm engineering and statistics. Even with the necessary knowledge and skills, success is not guaranteed.

In this paper, we present a survey of methods for algorithm selection in the black-box continuous optimization domain. We start the review by presenting Rice's (1976) selection framework. We describe each of the four component spaces - problem, algorithm, performance and characteristic - in terms of the requirements of black-box continuous optimization problems. This is followed by an examination of exploratory landscape analysis methods that can be used to effectively extract problem characteristics. Subsequently, we propose a classification of the landscape analysis methods based on their order, neighborhood structure and computational complexity. We then discuss applications of the algorithm selection framework and its relationship to algorithm portfolios, hybrid meta-heuristics, and hyper-heuristics. The paper concludes with the identification of key challenges and proposes future research directions.
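The selection framework described in the abstract can be sketched in a few lines. The sketch below is illustrative only: it uses fitness-distance correlation (Jones and Forrest's landscape measure) as a stand-in characteristic, and a hand-made threshold rule in place of the learned performance models the survey covers. All function names and the threshold value are assumptions, not taken from the paper.

```python
import math
import random

def fitness_distance_correlation(points, fitnesses):
    """Correlation between fitness and distance to the best sampled point
    (a simple exploratory landscape analysis feature)."""
    best = min(range(len(points)), key=lambda i: fitnesses[i])
    dists = [math.dist(p, points[best]) for p in points]
    n = len(points)
    mf, md = sum(fitnesses) / n, sum(dists) / n
    cov = sum((f - mf) * (d - md) for f, d in zip(fitnesses, dists)) / n
    sf = math.sqrt(sum((f - mf) ** 2 for f in fitnesses) / n)
    sd = math.sqrt(sum((d - md) ** 2 for d in dists) / n)
    return cov / (sf * sd) if sf > 0 and sd > 0 else 0.0

def select_algorithm(objective, dim, n_samples=200, seed=0):
    """Rice's mapping: problem -> characteristic space -> algorithm."""
    rng = random.Random(seed)
    # Sample the black box: only input/output pairs are available.
    points = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_samples)]
    fitnesses = [objective(p) for p in points]
    fdc = fitness_distance_correlation(points, fitnesses)
    # Toy selection rule: high FDC suggests a globally convex landscape
    # suited to local search; low FDC suggests multimodality, favouring
    # a population-based global search.
    return ("local-search" if fdc > 0.5 else "global-search"), fdc

sphere = lambda x: sum(v * v for v in x)  # unimodal test function
algo, fdc = select_algorithm(sphere, dim=2)
```

In practice the threshold rule would be replaced by a regression or classification model trained on benchmark performance data, as the surveyed approaches do.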

References

[1]
T. Abell, Y. Malitsky, K. Tierney, Features for exploiting black-box optimization problem structure, in: Proceedings of the 7th International Conference on Learning and Intelligent Optimization (LION7), Lect. Notes Comput. Sci., 2013, pp. 30-36.
[2]
S. Ali, K. Smith, On learning algorithm selection for classification, Appl. Soft Comput., 6 (2006) 119-138.
[3]
E. Anderson, Markov chain modelling of the solution surface in local search, J. Oper. Res. Soc., 53 (2002) 630-636.
[4]
C. Ansótegui, M. Sellmann, K. Tierney, A gender-based genetic algorithm for the automatic configuration of algorithms, in: Proceedings of the 15th International Conference on Principles and Practice of Constraint Programming, Springer-Verlag, Berlin, Heidelberg, 2009, pp. 142-157.
[5]
A. Auger, B. Doerr, Theory of Randomized Search Heuristics, World Scientific, 2011.
[6]
A. Auger, O. Teytaud, Continuous lunches are free plus the design of optimal optimization algorithms, Algorithmica, 57 (2010) 121-146.
[7]
T. Bartz-Beielstein, Experimental Research in Evolutionary Computation: The New Experimentalism, Springer, Berlin, Heidelberg, 2006.
[8]
T. Bartz-Beielstein, C. Lasarczyk, M. Preuß, Sequential parameter optimization, in: The 2005 IEEE Congress on Evolutionary Computation, vol. 1, 2005, pp. 773-780.
[9]
J. Beck, E. Freuder, Simple rules for low-knowledge algorithm selection, in: Lect. Notes Comput. Sci., vol. 3011, Springer, 2004, pp. 50-64.
[10]
J. Beck, J. Watson, Adaptive search algorithms and fitness-distance correlation, in: Proceedings of the Fifth Metaheuristics International Conference, 2003, pp. 1-6.
[11]
J. Bergstra, Y. Bengio, Random search for hyper-parameter optimization, J. Mach. Learn. Res., 13 (2012) 281-305.
[12]
H. Beyer, H. Schwefel, Evolution strategies: a comprehensive introduction, Nat. Comput., 1 (2002) 3-52.
[13]
M. Birattari, Z. Yuan, P. Balaprakash, T. Stützle, F-Race and iterated F-Race: an overview, in: Experimental Methods for the Analysis of Optimization Algorithms, Springer, Berlin, Heidelberg, 2010, pp. 311-336.
[14]
B. Bischl, O. Mersmann, H. Trautmann, M. Preuß, Algorithm selection based on exploratory landscape analysis and cost-sensitive learning, in: Proceedings of the Fourteenth International Conference on Genetic and Evolutionary Computation Conference, ACM, New York, NY, USA, 2012, pp. 313-320.
[15]
C. Blum, J. Puchinger, G. Raidl, A. Roli, Hybrid metaheuristics in combinatorial optimization: a survey, Appl. Soft Comput., 11 (2011) 4135-4151.
[16]
Y. Borenstein, R. Poli, Fitness distributions and GA hardness, in: Proceedings of Parallel Problem Solving from Nature (PPSN VIII), 2004, pp. 11-20.
[17]
C. Brooks, E. Durfee, Using landscape theory to measure learning difficulty for adaptive agents, in: Lect. Notes Comput. Sci., vol. 2636, Springer, 2003.
[18]
E. Burke, M. Hyde, G. Kendall, G. Ochoa, E. Ozcan, R. Qu, Hyper-heuristics: a survey of the state of the art, J. Oper. Res. Soc., 64 (2013) 1695-1724.
[19]
P. Caamaño, F. Bellas, J. Becerra, R. Duro, Evolutionary algorithm characterization in real parameter optimization problems, Appl. Soft Comput. (2013).
[20]
P. Caamaño, A. Prieto, J. Becerra, F. Bellas, R. Duro, Real-valued multimodal fitness landscape characterization for evolution, in: Lect. Notes Comput. Sci., vol. 6443, Springer, 2010, pp. 567-574.
[21]
M. Cavazzuti, Deterministic Optimization, Springer, Berlin, Heidelberg, 2013.
[22]
M. Cavazzuti, Stochastic Optimization, Springer, Berlin, Heidelberg, 2013.
[23]
X. Chen, Y. Ong, M. Lim, K. Tan, A multi-facet survey on memetic computation, IEEE Trans. Evol. Comput., 15 (2011) 591-607.
[24]
A. Conn, N. Gould, P. Toint, Trust Region Methods, Society for Industrial and Applied Mathematics, 2000.
[25]
A.D. Corte, K. Sörensen, Optimisation of gravity-fed water distribution network design: a critical review, Eur. J. Oper. Res., 228 (2013) 1-10.
[26]
C. Cortes, M. Mohri, M. Riley, A. Rostamizadeh, Sample selection bias correction theory, in: Lecture Notes in Computer Science, vol. 5254, Springer, Berlin, Heidelberg, 2008, pp. 38-53.
[27]
S. Das, P. Suganthan, Differential evolution: a survey of the state-of-the-art, IEEE Trans. Evol. Comput., 15 (2011) 4-31.
[28]
Y. Davidor, Epistasis variance: a viewpoint on GA-hardness, in: Foundations of Genetic Algorithms, Morgan Kaufmann, 1991, pp. 23-35.
[29]
K. De Jong, Parameter setting in EAs: a 30 year perspective, in: Stud. Comput. Intell., vol. 54, Springer, 2005, pp. 1-18.
[30]
M. Dorigo, V. Maniezzo, A. Colorni, Ant system: an autocatalytic optimizing process, Tech. Rep. 91-016, Politecnico di Milano, 1991.
[31]
R. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Sixth International Symposium on Micro Machine and Human Science, 1995, pp. 39-43.
[32]
B. Efron, R. Tibshirani, An Introduction to the Bootstrap, Chapman & Hall, Inc., 1993.
[33]
A. Eiben, R. Hinterding, Z. Michalewicz, Parameter control in evolutionary algorithms, IEEE Trans. Evol. Comput., 3 (1999) 124-141.
[34]
A. Eiben, Z. Michalewicz, M. Schoenauer, J. Smith, Parameter control in evolutionary algorithms, in: Stud. Comput. Intell., vol. 54, Springer, 2005, pp. 19-46.
[35]
A. Eiben, S. Smit, Parameter tuning for configuring and analyzing evolutionary algorithms, Swarm Evol. Comput., 1 (2011) 19-31.
[36]
A. Eremeev, C. Reeves, On confidence intervals for the number of local optima, in: Lect. Notes Comput. Sci., vol. 2611, Springer, 2003.
[37]
C. Fonlupt, D. Robilliard, P. Preux, A bit-wise epistasis measure for binary search spaces, in: Proceedings of Parallel Problem Solving from Nature (PPSN V), Lect. Notes Comput. Sci., vol. 1498, 1998, pp. 47-56.
[38]
O. Francois, C. Lavergne, Design of evolutionary algorithms - a statistical perspective, IEEE Trans. Evol. Comput., 5 (2001) 129-148.
[39]
M. Gagliolo, J. Schmidhuber, Learning dynamic algorithm portfolios, Ann. Math. Artif. Intel., 47 (2006) 295-328.
[40]
J. Garnier, L. Kallel, Efficiency of local search with multiple local optima, SIAM J. Discrete Math., 15 (2002) 122-141.
[41]
C. Gomes, B. Selman, Algorithm portfolios, Artif. Intell., 126 (2001) 43-62.
[42]
C. Gomes, B. Selman, N. Crato, H. Kautz, Heavy-tailed phenomena in satisfiability and constraint satisfaction problems, J. Autom. Reason., 24 (2000) 67-100.
[43]
N. Gould, D. Orban, P. Toint, CUTEst: a constrained and unconstrained testing environment with safe threads, Tech. Rep. RAL-TR-2013-005, Science and Technology Facilities Council, 2013.
[44]
M. Graff, H. Escalante, J. Cerda-Jacobo, A. Gonzalez, Models of performance of time series forecasters, Neurocomputing, 122 (2013) 375-385.
[45]
M. Graff, R. Poli, Practical performance models of algorithms in evolutionary program induction and other domains, Artif. Intell., 174 (2010) 1254-1276.
[46]
M. Graff, R. Poli, J. Flores, Models of performance of evolutionary program induction algorithms based on indicators of problem difficulty, Evol. Comput., 21 (2013) 533-560.
[47]
J. Grobler, A. Engelbrecht, G. Kendall, V. Yadavalli, Alternative hyper-heuristic strategies for multi-method global optimization, in: Proceedings of the IEEE Congress on Evolutionary Computation (CEC2010), 2010, pp. 1-8.
[48]
N. Hansen, A. Auger, S. Finck, R. Ros, Real-parameter black-box optimization benchmarking BBOB-2010: Experimental setup, Tech. Rep. RR-7215, INRIA, September 2010.
[49]
N. Hansen, A. Auger, R. Ros, S. Finck, P. Pošík, Comparing results of 31 algorithms from the black-box optimization benchmarking BBOB-2009, in: Genetic and Evolutionary Computation Conference, 2011, pp. 1689-1696.
[50]
N. Hansen, R. Ros, N. Mauny, M. Schoenauer, A. Auger, Impacts of invariance in search: when CMA-ES and PSO face ill-conditioned and non-separable problems, Appl. Soft Comput., 11 (2011) 5755-5769.
[51]
E. Hart, J. Timmis, Application areas of ais: the past, the present and the future, Appl. Soft Comput., 8 (2008) 191-201.
[52]
M. Hauschild, M. Pelikan, An introduction and survey of estimation of distribution algorithms, Swarm Evol. Comput., 1 (2011) 111-128.
[53]
J. He, C. Reeves, C. Witt, X. Yao, A note on problem difficulty measures in black-box optimization: classification, realizations and predictability, Evol. Comput., 15 (2007) 435-443.
[54]
R. Heckendorn, D. Whitley, Predicting epistasis from mathematical models, Evol. Comput., 7 (1999) 69-101.
[55]
D. Henderson, S. Jacobson, A. Johnson, The theory and practice of simulated annealing, in: International Series in Operations Research & Management Science, vol. 57, Springer, US, 2003, pp. 287-319.
[56]
M. Hilario, A. Kalousis, P. Nguyen, A. Woznica, A data mining ontology for algorithm selection and meta-mining, in: Third Generation Data Mining: Towards Service-Oriented Knowledge Discovery, 2009, pp. 76-87.
[57]
H. Hoos, T. Stützle, Stochastic Local Search: Foundations and Applications, Morgan Kaufmann, San Francisco, 2005.
[58]
H. Hoos, T. Stützle, Stochastic Local Search: Foundations and Applications, Morgan Kaufmann, San Francisco, 2005.
[59]
P. Hough, P. Williams, Modern machine learning for automatic optimization algorithm selection, in: Proceedings of the INFORMS Artificial Intelligence and Data Mining Workshop, 2006, pp. 1-6.
[60]
M. Houle, H.-P. Kriegel, P. Kröger, E. Schubert, A. Zimek, Can shared-neighbor distances defeat the curse of dimensionality?, in: Lect. Notes Comput. Sci., vol. 6187, Springer, 2010, pp. 482-500.
[61]
A. Howe, E. Dahlman, C. Hansen, M. Scheetz, A. Mayrhauser, Exploiting competitive planner performance, in: Lecture Notes in Computer Science, vol. 1809, Springer, Berlin, Heidelberg, 2000, pp. 62-72.
[62]
F. Hutter, Y. Hamadi, H. Hoos, K. Leyton-Brown, Performance prediction and automated tuning of randomized and parametric algorithms, in: Lect. Notes Comput. Sci., vol. 4204, Springer, 2006, pp. 213-228.
[63]
F. Hutter, H. Hoos, K. Leyton-Brown, Sequential model-based optimization for general algorithm configuration, in: Lecture Notes in Computer Science, vol. 6683, Springer, Berlin, Heidelberg, 2011, pp. 507-523.
[64]
F. Hutter, H. Hoos, K. Leyton-Brown, T. Stützle, ParamILS: an automatic algorithm configuration framework, J. Artif. Intell. Res., 36 (2009) 267-306.
[65]
F. Hutter, L. Xu, H. Hoos, K. Leyton-Brown, Algorithm runtime prediction: methods & evaluation, Artif. Intell., 206 (2014) 79-111.
[66]
T. Jansen, On classifications of fitness functions, Tech. Rep. CI-76/99, University of Dortmund, November 1999.
[67]
T. Jansen, Analyzing Evolutionary Algorithms, Springer, Berlin, Heidelberg, 2013.
[68]
T. Jones, S. Forrest, Fitness distance correlation as a measure of problem difficulty for genetic algorithms, in: Proceedings of the Sixth International Conference on Genetic Algorithms, Morgan Kaufmann Publishers Inc., 1995, pp. 184-192.
[69]
S. Kadioglu, Y. Malitsky, M. Sellmann, K. Tierney, ISAC - instance-specific algorithm configuration, in: Proceedings of the 2010 Conference on ECAI 2010: 19th European Conference on Artificial Intelligence, IOS Press, Amsterdam, The Netherlands, 2010, pp. 751-756.
[70]
J. Kanda, A. Carvalho, E. Hruschka, C. Soares, Selection of algorithms to solve traveling salesman problems using meta-learning, Int. J. Hybrid Intell. Syst., 8 (2011) 117-128.
[71]
D. Karaboga, B. Akay, A survey: algorithms simulating bee swarm intelligence, Artif. Intell. Rev., 31 (2009) 61-85.
[72]
D. Karnopp, Random search techniques for optimization problems, Automatica, 1 (1963) 111-121.
[73]
L. Kotthoff, Algorithm selection for combinatorial search problems: a survey, AI Magazine, 35 (2014).
[74]
K. Leyton-Brown, E. Nudelman, Y. Shoham, Learning the empirical hardness of optimization problems: the case of combinatorial auctions, in: Lect. Notes Comput. Sci., vol. 2470, Springer, 2002, pp. 91-100.
[75]
K. Leyton-Brown, E. Nudelman, Y. Shoham, Empirical hardness models: methodology and a case study on combinatorial auctions, J. ACM, 56 (2009) 22:1-22:52.
[76]
E. Leyva, Y. Caises, A. González, R. Pérez, On the use of meta-learning for instance selection: an architecture and an experimental study, Inform. Sci., 266 (2014) 16-30.
[77]
E. Leyva, A. González, R. Pérez, Knowledge-based instance selection: a compromise between efficiency and versatility, Knowl.-Based Syst., 47 (2013) 65-76.
[78]
X. Li, A. Engelbrecht, M. Epitropakis, Benchmark functions for CEC'2013 special session and competition on niching methods for multimodal function optimization, Tech. rep., RMIT University, 2013.
[79]
X. Li, K. Tang, M. Omidvar, Z. Yang, K. Qin, H. China, Benchmark functions for the CEC'2013 special session and competition on large-scale global optimization, Tech. rep., RMIT University, 2013.
[80]
Y. Liu, X. Yao, Q. Zhao, T. Higuchi, Scaling up fast evolutionary programming with cooperative coevolution, in: Proceedings of the 2001 Congress on Evolutionary Computation, vol. 2, IEEE, 2001, pp. 1101-1108.
[81]
M. Lozano, D. Molina, F. Herrera, Editorial scalability of evolutionary algorithms and other metaheuristics for large-scale continuous optimization problems, Soft Comput., 15 (2011) 2085-2087.
[82]
G. Lu, J. Li, Y. Yao, Fitness-probability cloud and a measure of problem hardness for evolutionary algorithms, in: Lect. Notes Comput. Sci., Springer, 2011, pp. 108-117.
[83]
M. Lunacek, D. Whitley, The dispersion metric and the CMA evolution strategy, in: Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, ACM, New York, NY, USA, 2006, pp. 477-484.
[84]
R. Luus, T. Jaakola, Optimization by direct search and systematic reduction of the size of search region, AIChE J., 19 (1973) 760-766.
[85]
K. Malan, A. Engelbrecht, Quantifying ruggedness of continuous landscapes using entropy, in: Proceedings of the 2009 IEEE Congress on Evolutionary Computation (CEC2009), 2009, pp. 1440-1447.
[86]
K. Malan, A. Engelbrecht, A survey of techniques for characterising fitness landscapes and some possible ways forward, Inform. Sci., 241 (2013) 148-163.
[87]
Y. Malitsky, M. Sellmann, Instance-specific algorithm configuration as a method for non-model-based portfolio generation, in: Lecture Notes in Computer Science, vol. 7298, Springer, Berlin, Heidelberg, 2012, pp. 244-259.
[88]
Y. Malitsky, A. Sabharwal, H. Samulowitz, M. Sellmann, Non-model-based algorithm portfolios for SAT, in: Lecture Notes in Computer Science, vol. 6695, Springer, Berlin, Heidelberg, 2011, pp. 369-370.
[89]
Y. Malitsky, A. Sabharwal, H. Samulowitz, M. Sellmann, Algorithm portfolios based on cost-sensitive hierarchical clustering, in: IJCAI'13, AAAI Press, 2013, pp. 608-614.
[90]
J. Marin, How landscape ruggedness influences the performance of real-coded algorithms: a comparative study, Soft Comput., 16 (2012) 683-698.
[91]
O. Mersmann, B. Bischl, H. Trautmann, M. Preuß, C. Weihs, G. Rudolph, Exploratory landscape analysis, in: GECCO '11, ACM, New York, NY, USA, 2011, pp. 829-836.
[92]
O. Mersmann, M. Preuß, H. Trautmann, Benchmarking evolutionary algorithms: towards exploratory landscape analysis, in: Lect. Notes Comput. Sci., vol. 6238, Springer, 2010, pp. 73-82.
[93]
P. Merz, Advanced fitness landscape analysis and the performance of memetic algorithms, Evol. Comput., 12 (2004) 303-325.
[94]
Z. Michalewicz, Quo vadis, evolutionary computation? On a growing gap between theory and practice, in: Lect. Notes Comput. Sci., vol. 7311, Springer, 2012, pp. 98-121.
[95]
C. Müller, I. Sbalzarini, Global characterization of the CEC 2005 fitness landscapes using fitness-distance analysis, in: Lect. Notes Comput. Sci., vol. 6624, Springer, 2011, pp. 294-303.
[96]
M. Montes de Oca, T. Stutzle, M. Birattari, M. Dorigo, Frankenstein's PSO: a composite particle swarm optimization algorithm, IEEE Trans. Evol. Comput., 13 (2009) 1120-1132.
[97]
A. Moraglio, Towards a geometric unification of evolutionary algorithms, Ph.D. thesis, University of Essex, 2007.
[98]
R. Morgan, M. Gallagher, Length scale for characterising continuous optimization problems, in: Proceedings of Parallel Problem Solving from Nature (PPSN XII), Lect. Notes Comput. Sci., 2012, pp. 407-416.
[99]
R. Morgan, M. Gallagher, Sampling techniques and distance metrics in high dimensional continuous landscape analysis: limitations and improvements, IEEE Trans. Evol. Comput., PP (2013).
[100]
M. Muñoz, M. Kirley, S. Halgamuge, The algorithm selection problem on the continuous optimization domain, in: Studies Comput. Intell., vol. 445, Springer, 2012, pp. 75-89.
[101]
M. Muñoz, M. Kirley, S. Halgamuge, Landscape characterization of numerical optimization problems using biased scattered data, in: Proceedings of the 2012 Congress on Evolutionary Computation (CEC2012), 2012, pp. 1-8.
[102]
M. Muñoz, M. Kirley, S. Halgamuge, A meta-learning prediction model of algorithm performance for continuous optimization problems, in: Proceedings of Parallel Problem Solving from Nature (PPSN XII), Lect. Notes Comput. Sci., vol. 7491, 2012, pp. 226-235.
[103]
M. Muñoz, M. Kirley, S. Halgamuge, Exploratory landscape analysis of continuous space optimization problems using information content, IEEE Trans. Evol. Comput., 19 (2015) 74-87.
[104]
B. Naudts, L. Kallel, A comparison of predictive measures of problem difficulty in evolutionary algorithms, IEEE Trans. Evol. Comput., 4 (2000) 1-15.
[105]
B. Naudts, D. Suys, A. Verschoren, Epistasis as a basic concept in formal landscape analysis, in: Proceedings of the 7th International Conference on Genetic Algorithms, Morgan Kaufmann, 1997, pp. 65-72.
[106]
F. Neumann, C. Witt, Bioinspired Computation in Combinatorial Optimization, Springer, Berlin, Heidelberg, 2010.
[107]
J. Nocedal, S. Wright, Conjugate gradient methods, in: Springer Series in Operations Research and Financial Engineering, Springer, New York, 2006, pp. 101-134.
[108]
J. Nocedal, S. Wright, Fundamentals of unconstrained optimization, in: Springer Series in Operations Research and Financial Engineering, Springer, New York, 2006, pp. 10-29.
[109]
J. Nocedal, S. Wright, Quasi-newton methods, in: Springer Series in Operations Research and Financial Engineering, Springer, New York, 2006, pp. 135-163.
[110]
J. Nocedal, S. Wright, Trust-region methods, in: Springer Series in Operations Research and Financial Engineering, Springer, New York, 2006, pp. 66-100.
[111]
P. Oliveto, X. Yao, Runtime analysis of evolutionary algorithms for discrete optimization, in: Theory of Randomized Search Heuristics, World Scientific, 2011, pp. 21-52.
[112]
G. Pappa, G. Ochoa, M.R. Hyde, A. Freitas, J. Woodward, J. Swan, Contrasting meta-learning and hyper-heuristic research: the role of evolutionary algorithms, Genet. Program. Evolvable Mach. (2013) 1-33.
[113]
K. Passino, Biomimicry of bacterial foraging for distributed optimization and control, IEEE Control Syst. Mag., 22 (2002) 52-67.
[114]
M. Pedersen, Tuning & simplifying heuristical optimization, Ph.D. thesis, University of Southampton, 2009.
[115]
F. Peng, K. Tang, G. Chen, X. Yao, Population-based algorithm portfolios for numerical optimization, IEEE Trans. Evol. Comput., 14 (2010) 782-800.
[116]
A.P. Piotrowski, J.J. Napiorkowski, P.M. Rowinski, How novel is the "novel" black hole optimization approach?, Inform. Sci., 267 (2014) 191-200.
[117]
E. Pitzer, M. Affenzeller, A. Beham, S. Wagner, Comprehensive and automatic fitness landscape analysis using HeuristicLab, in: EUROCAST 2011, Lect. Notes Comput. Sci., vol. 6927, 2012, pp. 424-431.
[118]
E. Pitzer, M. Affenzeller, A comprehensive survey on fitness landscape analysis, in: Studies Comput. Intell., vol. 378, Springer, Berlin, Heidelberg, 2012, pp. 161-191.
[119]
R. Poli, L. Vanneschi, W. Langdon, N. McPhee, Theoretical results in genetic programming: the next ten years?, Genet. Program. Evolvable Mach., 11 (2010) 285-320.
[120]
M. Preuß, C. Stoean, R. Stoean, Niching foundations: basin identification on fixed-property generated landscapes, in: GECCO '11, ACM, New York, NY, USA, 2011, pp. 837-844.
[121]
K. Price, Differential evolution vs. the functions of the 2nd ICEO, in: Proceedings of the IEEE International Conference on Evolutionary Computation, 1997, pp. 153-157.
[122]
C. Reeves, Fitness landscapes, in: Search Methodologies, Springer, 2005, pp. 587-610.
[123]
C. Reeves, A. Eremeev, Statistical analysis of local search landscapes, J. Oper. Res. Soc., 55 (2004) 687-693.
[124]
C. Reeves, C. Wright, An experimental design perspective on genetic algorithms, in: Foundations of Genetic Algorithms 3, Morgan Kaufmann, 1995, pp. 7-22.
[125]
J. Rice, The algorithm selection problem, in: Advances in Computers, vol. 15, Elsevier, 1976, pp. 65-118.
[126]
J. Rice, Methodology for the algorithm selection problem, in: Proceedings of the IFIP TC 2.5 Working Conference on Performance Evaluation of Numerical Software, 1979, pp. 301-307.
[127]
S. Rochet, M. Slimane, G. Venturini, Epistasis for real encoding in genetic algorithms, in: Australian and New Zealand Conference on Intelligent Information Systems, 1996, pp. 268-271.
[128]
S. Rochet, G. Venturini, M. Slimane, E. El Kharoubi, A critical and empirical study of epistasis measures for predicting ga performances: a summary, in: Third European Conference on Artificial Evolution, 1998, pp. 275-285.
[129]
R. Ros, Real-parameter black-box optimisation: benchmarking and designing algorithms, Ph.D. thesis, Université Paris-Sud, December 2009.
[130]
H. Rosé, W. Ebeling, T. Asselmeyer, The density of states - a measure of the difficulty of optimisation problems, in: Lect. Notes Comput. Sci., vol. 1141, Springer, Berlin, Heidelberg, 1996, pp. 208-217.
[131]
J. Schmee, G. Hahn, A simple method for regression analysis with censored data, Technometrics, 21 (1979) 417-432.
[132]
D. Seo, B. Moon, An information-theoretic analysis on the interactions of variables in combinatorial optimization problems, Evol. Comput., 15 (2007) 169-198.
[133]
D. Simon, R. Rarick, M. Ergezer, D. Du, Analytical and numerical comparisons of biogeography-based optimization and genetic algorithms, Inform. Sci., 181 (2011) 1224-1248.
[134]
J. Smith, T. Fogarty, Operator and parameter adaptation in genetic algorithms, Soft Comput., 1 (1997) 81-87.
[135]
T. Smith, P. Husbands, P. Layzell, M. O'Shea, Fitness landscapes and evolvability, Evol. Comput., 10 (2002) 1-34.
[136]
K. Smith-Miles, Towards insightful algorithm selection for optimisation using meta-learning concepts, in: IEEE International Joint Conference on Neural Networks, 2008 (IJCNN 2008), (IEEE World Congress on Computational Intelligence), 2008, pp. 4118-4124.
[137]
K. Smith-Miles, Cross-disciplinary perspectives on meta-learning for algorithm selection, ACM Comput. Surv., 41 (2009) 6:1-6:25.
[138]
K. Smith-Miles, D. Baatar, B. Wreford, R. Lewis, Towards objective measures of algorithm performance across instance space, Comput. Oper. Res., 45 (2014) 12-24.
[139]
K. Smith-Miles, R. James, J. Giffin, Y. Tu, A knowledge discovery approach to understanding relationships between scheduling problem structure and heuristic performance, in: Lect. Notes Comput. Sci., vol. 5851, Springer, 2009, pp. 89-103.
[140]
K. Smith-Miles, J. van Hemert, Discovering the suitability of optimisation algorithms by learning from evolved instances, Ann. Math. Artif. Intel., 61 (2011) 87-104.
[141]
K. Smith-Miles, B. Wreford, L. Lopes, N. Insani, Predicting metaheuristic performance on graph coloring problems using data mining, in: Studies Comput. Intell., vol. 434, Springer, Berlin, Heidelberg, 2013, pp. 417-432.
[142]
K. Sörensen, Metaheuristics - the metaphor exposed, Int. Trans. Oper. Res., 22 (2015) 3-18.
[143]
P. Stadler, Landscapes and their correlation functions, J. Math. Chem., 20 (1996) 1-45.
[144]
K. Steer, A. Wirth, S. Halgamuge, Information theoretic classification of problems for metaheuristics, in: Lect. Notes Comput. Sci., vol. 5361, Springer, 2008, pp. 319-328.
[145]
C. Stephens, R. Poli, EC theory - "in theory", in: Genet. Evol. Comput. Ser., vol. 11, Springer, 2004, pp. 129-155.
[146]
P. Suganthan, N. Hansen, J. Liang, K. Deb, Y. Chen, A. Auger, S. Tiwari, Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization, Tech. rep., NTU, Singapore and IIT, Kanpur, 2005. <http://www.bionik.tu-berlin.de/user/niko/Tech-Report-May-30-05.pdf>.
[147]
K. Tang, F. Peng, G. Chen, X. Yao, Population-based algorithm portfolios with automated constituent algorithms selection, Inform. Sci., 279 (2014) 94-104.
[148]
C. Thornton, F. Hutter, H. Hoos, K. Leyton-Brown, Auto-WEKA: combined selection and hyperparameter optimization of classification algorithms, in: Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM, New York, NY, USA, 2013, pp. 847-855.
[149]
M. Tomassini, L. Vanneschi, P. Collard, M. Clergue, A study of fitness distance correlation as a difficulty measure in genetic programming, Evol. Comput., 13 (2005) 213-239.
[150]
G. Uludag, A. Sima Uyar, Fitness landscape analysis of differential evolution algorithms, in: Fifth International Conference on Soft Computing, Computing with Words and Perceptions in System Analysis, Decision and Control, ICSCCW 2009, 2009, pp. 1-4.
[151]
L. Vanneschi, M. Clergue, P. Collard, M. Tomassini, S. Vérel, Fitness clouds and problem hardness in genetic programming, in: Lect. Notes Comput. Sci., vol. 3103, Springer, 2004, pp. 690-701.
[152]
L. Vanneschi, M. Tomassini, P. Collard, M. Clergue, Fitness distance correlation in structural mutation genetic programming, in: Lect. Notes Comput. Sci., vol. 2610, Springer, 2003, pp. 455-464.
[153]
V. Vassilev, T. Fogarty, J. Miller, Information characteristics and the structure of landscapes, Evol. Comput., 8 (2000) 31-60.
[154]
V. Vassilevska, R. Williams, S. Woo, Confronting hardness using a hybrid approach, in: Proceedings of the Seventeenth Annual ACM-SIAM Symposium on Discrete Algorithm, ACM, New York, NY, USA, 2006, pp. 1-10.
[155]
M. Črepinšek, S. Liu, L. Mernik, A note on teaching-learning-based optimization algorithm, Inform. Sci., 212 (2012) 79-93.
[156]
J. Vrugt, B. Robinson, J. Hyman, Self-adaptive multimethod search for global optimization in real-parameter spaces, IEEE Trans. Evol. Comput., 13 (2009) 243-259.
[157]
G. Wang, Q. Song, H. Sun, X. Zhang, B. Xu, Y. Zhou, A feature subset selection algorithm automatic recommendation method, J. Artif. Intell. Res., 47 (2013) 1-34.
[158]
J. Watson, J. Beck, A. Howe, L. Whitley, Problem difficulty for tabu search in job-shop scheduling, Artif. Intell., 143 (2003) 189-217.
[159]
J. Watson, A. Howe, Focusing on the individual: Why we need new empirical methods for characterizing problem difficulty, Working Notes of ECAI 2000 Workshop on Empirical Methods in Artificial Intelligence, August 2000.
[160]
E. Weinberger, Correlated and uncorrelated fitness landscapes and how to tell the difference, Biol. Cybern., 63 (1990) 325-336.
[161]
E. Weinberger, P. Stadler, Why some fitness landscapes are fractal, J. Theor. Biol., 163 (1993) 255-275.
[162]
T. Weise, M. Zapf, R. Chiong, A. Nebro, Why is optimization difficult?, in: Stud. Comput. Intell., vol. 193, Springer, 2009, pp. 1-50.
[163]
D. Weyland, A rigorous analysis of the harmony search algorithm: how the research community can be misled by a "novel" methodology, Int. J. Appl. Metaheur. Comput., 1 (2010) 50-60.
[164]
D. Whitley, A genetic algorithm tutorial, Stat. Comput., 4 (1994) 65-85.
[165]
D. Wolpert, W. Macready, No free lunch theorems for optimization, IEEE Trans. Evol. Comput., 1 (1997) 67-82.
[166]
L. Xu, F. Hutter, H. Hoos, K. Leyton-Brown, SATzilla: portfolio-based algorithm selection for SAT, J. Artif. Intell. Res., 32 (2008) 565-606.
[167]
L. Xu, F. Hutter, H. Hoos, K. Leyton-Brown, SATzilla2009: an automatic algorithm portfolio for SAT, solver description, in: 2009 SAT Competition, 2009.
[168]
L. Xu, F. Hutter, H. Hoos, K. Leyton-Brown, Hydra-MIP: automated algorithm configuration and selection for mixed integer programming, in: Proceedings of the 18th RCRA Workshop on Experimental Evaluation of Algorithms for Solving Problems with Combinatorial Explosion, 2011.
[169]
L. Xu, F. Hutter, H. Hoos, K. Leyton-Brown, Evaluating component solver contributions to portfolio-based algorithm selectors, in: Lecture Notes in Computer Science, vol. 7317, Springer, Berlin, Heidelberg, 2012, pp. 228-241.
[170]
B. Zadrozny, Learning and evaluating classifiers under sample selection bias, in: Proceedings of the Twenty-first International Conference on Machine Learning, ACM, New York, NY, USA, 2004, pp. 114.
[171]
M. Zakynthinaki, J. Stirling, C.C. Martínez, A.L.D. de Durana, M.S. Quintana, G.R. Romo, J.S. Molinuevo, Modeling the basin of attraction as a two-dimensional manifold from experimental data: applications to balance in humans, Chaos, 20 (2010) 013119.


Published In

Information Sciences: an International Journal, Volume 317, Issue C, October 2015 (369 pages).

Publisher

Elsevier Science Inc., United States


Author Tags

  1. Algorithm selection
  2. Black-box continuous optimization
  3. Empirical performance models
  4. Exploratory landscape analysis
  5. Performance prediction
  6. Problem hardness measures

Qualifiers

  • Research-article

Cited By

  • (2024) Generating Cheap Representative Functions for Expensive Automotive Crashworthiness Optimization, ACM Transactions on Evolutionary Learning and Optimization, 10.1145/3646554, 13-Feb-2024
  • (2024) Analyzing Violation Landscapes Using Different Definitions of Constraint Violation, Proceedings of the Genetic and Evolutionary Computation Conference Companion, 10.1145/3638530.3664118, pp. 1815-1823, 14-Jul-2024
  • (2024) Towards an Improved Understanding of Features for More Interpretable Landscape Analysis, Proceedings of the Genetic and Evolutionary Computation Conference Companion, 10.1145/3638530.3654301, pp. 135-138, 14-Jul-2024
  • (2024) The Lunar Lander Landing Site Selection Benchmark Reexamined: Problem Characterization and Algorithm Performance, Proceedings of the Genetic and Evolutionary Computation Conference, 10.1145/3638529.3654229, pp. 1381-1389, 14-Jul-2024
  • (2024) A learned cost model for big data query processing, Information Sciences: an International Journal, vol. 670, 10.1016/j.ins.2024.120650, 1-Jun-2024
  • (2024) 3D meta-classification, Information Sciences: an International Journal, vol. 662, 10.1016/j.ins.2024.120272, 1-Mar-2024
  • (2024) Landscape-Aware Automated Algorithm Configuration Using Multi-output Mixed Regression and Classification, Parallel Problem Solving from Nature – PPSN XVIII, 10.1007/978-3-031-70068-2_6, pp. 87-104, 14-Sep-2024
  • (2024) Hybridizing Target- and SHAP-Encoded Features for Algorithm Selection in Mixed-Variable Black-Box Optimization, Parallel Problem Solving from Nature – PPSN XVIII, 10.1007/978-3-031-70068-2_10, pp. 154-169, 14-Sep-2024
  • (2023) Exploring structural similarity in fitness landscapes via graph data mining, Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence, 10.24963/ijcai.2023/621, pp. 5595-5603, 19-Aug-2023
  • (2023) Empirical Loss Landscape Analysis of Neural Network Activation Functions, Proceedings of the Companion Conference on Genetic and Evolutionary Computation, 10.1145/3583133.3596321, pp. 2029-2037, 15-Jul-2023
