AVEI-BO: an efficient Bayesian optimization using adaptively varied expected improvement

Research Paper · Published in Structural and Multidisciplinary Optimization

Abstract

Bayesian optimization (BO) finds the optimum by iteratively updating a surrogate model that approximates the complex objective function and constructing an acquisition function that guides sequential sampling. As a widely recognized acquisition function, expected improvement (EI) has played an essential role in BO. However, it may fall into “over-exploitation” or “over-exploration” on complex multi-modal problems. To overcome these shortcomings, this paper proposes a novel acquisition function named adaptively varied expected improvement (AVEI), which is essentially the expected value of a newly defined target improvement function. The core ideas of AVEI are: (i) a weight coefficient and a target parameter (the difference between the present optimal value and a desired improvement) are introduced to control the balance between local exploitation and global exploration; (ii) a real-time performance-monitoring parameter (the ratio of the actual improvement to the desired improvement) and a simple control law are proposed to update the weight coefficient and target parameter adaptively. An efficient Bayesian optimization algorithm, AVEI-BO, is thus developed using a Gaussian process (GP) surrogate and AVEI. The traditional EI-BO is also constructed using the GP and EI for comparison. The performance of AVEI-BO and EI-BO is first evaluated on eight benchmark problems. Extended discussions then compare AVEI-BO with five classical acquisition functions. The results show that AVEI-BO has advantages in optimization capability, convergence speed, and numerical robustness. Moreover, a typical design optimization task for a turbine disk is used to investigate the engineering applicability of AVEI-BO, and the results further confirm its superiority.
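The exact form of AVEI is given only in the full article; the sketch below is an assumption-based illustration of the mechanism the abstract describes: an EI-like criterion evaluated against a shifted target, a weight coefficient trading off exploitation against exploration, and a control law that adapts both using the ratio of the actual to the desired improvement. The function names (`weighted_target_ei`, `propose_next`, `update_w_and_target`), the random candidate-sampling scheme, and the numeric update factors are illustrative choices, not the authors' formulation.

```python
# Minimal sketch (not the paper's exact method): an EI-style acquisition with
# a weight coefficient `w` and a target parameter `target`, plus an
# illustrative control law driven by the actual-to-desired improvement ratio.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def weighted_target_ei(x_cand, gp, f_best, w, target):
    """EI computed against the shifted incumbent (f_best - target), with w
    weighting the exploitation term against the exploration term."""
    mu, sigma = gp.predict(x_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    imp = (f_best - target) - mu          # improvement over the shifted target
    z = imp / sigma
    return w * imp * norm.cdf(z) + (1.0 - w) * sigma * norm.pdf(z)


def propose_next(X, y, bounds, w, target, rng, n_cand=2048):
    """One sequential-sampling step: refit the GP surrogate, then maximize the
    acquisition over random candidates inside the box bounds (minimization)."""
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_cand, bounds.shape[0]))
    acq = weighted_target_ei(cand, gp, y.min(), w, target)
    return cand[np.argmax(acq)]


def update_w_and_target(actual_imp, desired_imp, w, target):
    """Illustrative control law (an assumption): exploit more when the actual
    improvement meets the desired improvement, otherwise lean toward
    exploration by lowering w and enlarging the target."""
    ratio = actual_imp / max(desired_imp, 1e-12)
    if ratio >= 1.0:
        w, target = min(1.0, 1.1 * w), 0.5 * target   # tighten around incumbent
    else:
        w, target = max(0.1, 0.9 * w), 2.0 * target   # widen the search
    return w, target
```

With w = 1 and target = 0 the criterion reduces to the classical EI of Jones et al. (1998); the adaptive loop simply alternates `propose_next`, an objective evaluation, and `update_w_and_target` until the evaluation budget is exhausted.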

Acknowledgements

The authors would like to thank the anonymous reviewers for their valuable comments.

Funding

This study was co-supported by the National Natural Science Foundation of China (No. 52005421), the Natural Science Foundation of Fujian Province of China (No. 2020J05020), the National Science and Technology Major Project (No. J2019-I-0013-0013), the Fundamental Research Funds for the Central Universities (No. 20720210090), and the Project Funded by the China Postdoctoral Science Foundation (Nos. 2020M682584, 2021T140634).

Author information

Corresponding author

Correspondence to He Liu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Replication of results

The datasets generated and analyzed during the current study are available from the corresponding author on reasonable request.

Additional information

Responsible Editor: Nestor V Queipo

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Yan, C., Du, H., Kang, E. et al. AVEI-BO: an efficient Bayesian optimization using adaptively varied expected improvement. Struct Multidisc Optim 65, 164 (2022). https://doi.org/10.1007/s00158-022-03256-3
