Abstract
Real-world optimization scenarios under uncertainty and noise are typically handled with robust optimization techniques, which reformulate the original optimization problem into a robust counterpart, e.g., by averaging the function values over different perturbations of a given input. Solving the robust counterpart instead of the original problem can significantly increase the associated computational cost, a fact that, to the best of our knowledge, is often overlooked in the literature. This extra cost of robust optimization may depend on the problem landscape, the dimensionality, the severity of the uncertainty, and the formulation of the robust counterpart. This paper takes an empirical approach to evaluating and comparing the computational cost incurred by different robustness formulations in Kriging-based optimization over a broad combination (300 test cases) of problems, uncertainty levels, and dimensions. We focus on the CPU time needed to find robust solutions and consider five commonly applied robustness formulations: “mini-max robustness”, “mini-max regret robustness”, “expectation-based robustness”, “dispersion-based robustness”, and “composite robustness”. We assess the empirical performance of these formulations with fixed-budget and fixed-target analyses, from which we find that “mini-max robustness” is the most practical formulation with respect to the associated computational cost.
This research has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement number 766186.
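To make the five robustness formulations concrete, the following Python sketch gives one plausible Monte Carlo reading of each; it is an illustration under our own assumptions, not the implementation from the paper or its repository. All names (robust_counterparts, f, x, deltas) are hypothetical, the equal mean/spread weighting in the composite criterion is an illustrative choice, and the local inner minimization used for the regret term is a cheap stand-in for the global best response under each perturbation.

```python
import numpy as np
from scipy.optimize import minimize

def robust_counterparts(f, x, deltas):
    """Monte Carlo sketches of five common robustness formulations (RFs).

    f      -- deterministic objective, f(x) -> float
    x      -- candidate design point, shape (d,)
    deltas -- sampled input perturbations, shape (m, d)
    """
    vals = np.array([f(x + d) for d in deltas])

    # Mini-max: guard against the worst sampled perturbation.
    mini_max = vals.max()

    # Mini-max regret: worst gap to the best attainable response under each
    # perturbation; a local inner minimization approximates that best response.
    regrets = [f(x + d) - minimize(lambda z, d=d: f(z + d), x).fun
               for d in deltas]
    mini_max_regret = max(regrets)

    # Expectation-based: average response over the perturbations.
    expectation = vals.mean()

    # Dispersion-based: variability of the response.
    dispersion = vals.std()

    # Composite: mean plus spread (equal weighting is an illustrative choice).
    composite = expectation + dispersion

    return {"mini-max": mini_max, "mini-max regret": mini_max_regret,
            "expectation": expectation, "dispersion": dispersion,
            "composite": composite}

# Usage: 2-D sphere function with uniform additive input uncertainty.
rng = np.random.default_rng(42)
f = lambda z: float(np.sum(z ** 2))
x = np.array([1.0, -0.5])
deltas = rng.uniform(-0.1, 0.1, size=(50, 2))
print(robust_counterparts(f, x, deltas))
```

Note how every formulation needs many evaluations of f per candidate point (and the regret term an inner optimization besides), which is precisely the extra computational cost the paper sets out to measure.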
Notes
1. An underlying assumption in this study is the non-existence of hard constraints on the choice of RF. In some practical scenarios of RO, this assumption is not valid. Note, however, that there are plenty of situations where the assumption holds, and the lack of information on the computational aspects of the RFs makes it harder for practitioners to choose a suitable robustness criterion.
2. The robust or effective function response has already been defined in Sect. 2 for five of the most common RFs.
3. The source code to reproduce the experimental setup and results is available at: https://github.com/SibghatUllah13/UllahPPSN2022.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Ullah, S., Wang, H., Menzel, S., Sendhoff, B., Bäck, T. (2022). A Systematic Approach to Analyze the Computational Cost of Robustness in Model-Assisted Robust Optimization. In: Rudolph, G., Kononova, A.V., Aguirre, H., Kerschke, P., Ochoa, G., Tušar, T. (eds) Parallel Problem Solving from Nature – PPSN XVII. PPSN 2022. Lecture Notes in Computer Science, vol 13398. Springer, Cham. https://doi.org/10.1007/978-3-031-14714-2_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-14713-5
Online ISBN: 978-3-031-14714-2