Abstract
Bayesian optimization (BO) finds the optimum by iteratively updating a surrogate model to approximate the complex objective function and constructing an acquisition function to guide sequential sampling. As a widely recognized acquisition function, expected improvement (EI) has played an essential role in BO. However, it may fall into “over-exploitation” or “over-exploration” on complex multi-modal problems. To overcome these shortcomings, this paper proposes a novel acquisition function named adaptively varied expected improvement (AVEI), which is essentially the expected value of a newly defined target improvement function. The core ideas of AVEI are: (i) a weight coefficient and a target parameter (the difference between the current optimal value and a desired improvement) are introduced to precisely balance local exploitation and global exploration; (ii) a real-time performance monitoring parameter (the ratio of the actual improvement to the desired improvement) and a convenient control law are proposed to update the weight coefficient and the target parameter adaptively. An efficient Bayesian optimization algorithm, AVEI-BO, is then developed using a Gaussian process (GP) surrogate and AVEI; the traditional EI-BO, built from GP and EI, is constructed for comparison. The performance of AVEI-BO and EI-BO is first evaluated on eight benchmark problems, and extended comparisons of AVEI-BO with five classical acquisition functions are then conducted. The results show that AVEI-BO possesses advantages in optimization capability, convergence speed, and numerical robustness. Moreover, a typical design optimization task of a turbine disk is employed to investigate the engineering applicability of AVEI-BO, and the results confirm its superiority in this application.
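To make the mechanism described in the abstract more concrete, the following Python sketch shows one plausible form of a weighted, target-based EI acquisition and an adaptive update rule. It is an illustrative assumption only: the function names (avei, update_parameters), the specific weighting of the exploitation and exploration terms, and the numeric update constants are not taken from the paper, whose exact AVEI formula and control law differ.

```python
# Illustrative sketch only (minimization setting). The expressions below are
# plausible stand-ins based on the standard weighted expected-improvement form,
# not the authors' actual AVEI formula or control law.
import numpy as np
from scipy.stats import norm

def avei(mu, sigma, f_best, alpha, w, eps=1e-12):
    """Acquisition value given GP posterior mean `mu` and std `sigma`.

    f_best : current best (minimum) observed objective value
    alpha  : desired improvement, so the target is T = f_best - alpha
    w      : weight in [0, 1]; larger w emphasizes local exploitation,
             smaller w emphasizes global exploration (assumed convention)
    """
    target = f_best - alpha                      # improve on a shifted target, not f_best itself
    z = (target - mu) / np.maximum(sigma, eps)
    # weighted split of the classical EI into exploitation and exploration terms
    return w * (target - mu) * norm.cdf(z) + (1.0 - w) * sigma * norm.pdf(z)

def update_parameters(actual_improvement, alpha, w):
    """Toy control law: the monitoring ratio r = actual/desired improvement
    drives the next weight and desired improvement (hypothetical constants)."""
    r = actual_improvement / max(alpha, 1e-12)
    if r >= 1.0:   # desired improvement achieved: exploit more, raise the bar
        return min(1.0, w + 0.1), alpha * 1.5
    else:          # falling short: explore more, lower the desired improvement
        return max(0.0, w - 0.1), alpha * 0.5
```

In this sketch the monitoring ratio plays the role described in the abstract: when the observed improvement keeps up with the desired improvement, the acquisition shifts toward exploitation; otherwise it shifts toward exploration.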
Acknowledgements
The authors would like to thank the anonymous reviewers for their valuable comments.
Funding
This study was co-supported by the National Natural Science Foundation of China (No. 52005421), the Natural Science Foundation of Fujian Province of China (No. 2020J05020), the National Science and Technology Major Project (No. J2019-I-0013-0013), the Fundamental Research Funds for the Central Universities (No. 20720210090), and the Project Funded by the China Postdoctoral Science Foundation (Nos. 2020M682584, 2021T140634).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Replication of results
The datasets generated and analyzed during the current study are available from the corresponding author on reasonable request.
Additional information
Responsible Editor: Nestor V Queipo
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Yan, C., Du, H., Kang, E. et al. AVEI-BO: an efficient Bayesian optimization using adaptively varied expected improvement. Struct Multidisc Optim 65, 164 (2022). https://doi.org/10.1007/s00158-022-03256-3