
A multiple surrogates based PSO algorithm


Abstract

Particle swarm optimization (PSO) usually requires a large number of fitness evaluations to obtain a sufficiently good solution, which is an obstacle to applying PSO to computationally expensive problems. In this paper, a multiple surrogates based PSO (MSPSO) framework is proposed, consisting of an outer optimization loop and an inner one. In the outer loop, a PSO algorithm operates both as an optimizer and as a sampler. In the inner loop, a parallel optimization strategy based on multiple surrogates is designed. The search history and candidate solutions from the outer loop are supplied to the inner loop, and the result of the inner loop is in turn used to guide the outer search. To verify the performance of the proposed approach, numerical experiments are conducted on ten benchmark test functions and three time series regression modeling problems. The results indicate that the proposed framework converges to good solutions for low-dimensional, non-convex and multimodal problems.
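
To make the two-loop structure concrete, the sketch below shows one way such a scheme could be wired together. It is not the authors' implementation: the test objective (Rosenbrock), the choice of surrogates (a Gaussian RBF interpolant and a diagonal quadratic regression), the rule for sampling candidates around the global best, and every parameter value are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of a two-loop, surrogate-assisted PSO.
# An outer PSO loop evaluates particles exactly and archives the results; an
# inner step fits two cheap surrogates on that archive, lets each surrogate
# nominate a candidate near the global best, and verifies the nominees exactly
# before they are allowed to guide the swarm. All names and parameters here
# are assumptions for illustration.

import numpy as np


def expensive_f(x):
    """Stand-in for an expensive objective (here: the Rosenbrock function)."""
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)


def fit_rbf(X, y, eps=1.0):
    """Gaussian RBF interpolant fitted by regularized least squares."""
    K = np.exp(-eps * np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2))
    w = np.linalg.lstsq(K + 1e-8 * np.eye(len(X)), y, rcond=None)[0]
    return lambda x: np.exp(-eps * np.sum((X - x) ** 2, axis=1)) @ w


def fit_quadratic(X, y):
    """Diagonal quadratic regression model."""
    Phi = np.hstack([np.ones((len(X), 1)), X, X ** 2])
    beta = np.linalg.lstsq(Phi, y, rcond=None)[0]
    return lambda x: np.hstack([1.0, x, x ** 2]) @ beta


def mspso_sketch(dim=2, n_particles=20, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = -2.0, 2.0
    X = rng.uniform(lo, hi, (n_particles, dim))          # particle positions
    V = np.zeros_like(X)                                 # particle velocities
    f = np.array([expensive_f(x) for x in X])            # exact evaluations
    pbest, pbest_f = X.copy(), f.copy()
    g = int(np.argmin(pbest_f))
    gbest, gbest_f = pbest[g].copy(), pbest_f[g]
    archive_X, archive_y = list(X), list(f)              # search history

    for _ in range(iters):
        # Outer loop: a standard PSO update that also acts as a sampler.
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (gbest - X)
        X = np.clip(X + V, lo, hi)
        f = np.array([expensive_f(x) for x in X])
        archive_X.extend(X)
        archive_y.extend(f)

        better = f < pbest_f
        pbest[better], pbest_f[better] = X[better], f[better]
        g = int(np.argmin(pbest_f))
        if pbest_f[g] < gbest_f:
            gbest, gbest_f = pbest[g].copy(), pbest_f[g]

        # Inner loop: fit multiple surrogates on recent history, let each one
        # nominate a candidate, and check the nominees on the true objective.
        A = np.array(archive_X[-200:])
        b = np.array(archive_y[-200:])
        cands = np.clip(gbest + 0.1 * rng.standard_normal((200, dim)), lo, hi)
        for model in (fit_rbf(A, b), fit_quadratic(A, b)):
            best = cands[int(np.argmin([model(c) for c in cands]))]
            f_best = expensive_f(best)                   # exact verification
            archive_X.append(best)
            archive_y.append(f_best)
            if f_best < gbest_f:                         # guide the outer loop
                gbest, gbest_f = best.copy(), f_best
    return gbest, gbest_f


if __name__ == "__main__":
    x_star, f_star = mspso_sketch()
    print("best point:", x_star, "objective value:", f_star)
```

Verifying each surrogate's nominee with the true objective before it may replace the global best is what keeps cheap but possibly inaccurate models from misleading the swarm; this mirrors the abstract's description of the inner loop feeding its result back to the outer one.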





Acknowledgements

This work is supported by the National Natural Science Foundation of China (Nos. 61473056, 61533005, 61522304, U1560102), the National Sci-Tech Support Plan (No. 2015BAF22B01) and the Fundamental Research Funds for the Central Universities (DUT17ZD231).

Author information

Corresponding author

Correspondence to Jun Zhao.


About this article

Cite this article

Lv, Z., Zhao, J., Wang, W. et al. A multiple surrogates based PSO algorithm. Artif Intell Rev 52, 2169–2190 (2019). https://doi.org/10.1007/s10462-017-9601-3
