Pareto Front Upconvert by Iterative Estimation Modeling and Solution Sampling

  • Conference paper
Evolutionary Multi-Criterion Optimization (EMO 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13970)

Abstract

To efficiently upconvert the resolution of the Pareto front by utilizing a known candidate solution set, this paper proposes an algorithm that builds Pareto front and Pareto set estimation models and repeatedly samples a solution from them, evaluates it, and updates the estimation models with it. The conventional supervised multi-objective optimization algorithm (SMOA) builds the Pareto front and Pareto set estimation models from a known candidate solution set, then samples a set of well-distributed estimated points and evaluates them to upconvert the Pareto front resolution. However, depending on the distribution of the known candidate solutions, the accuracy of the estimation models, and thus of the points estimated from them, cannot be guaranteed. The proposed method, the iterative SMOA (I-SMOA), gradually improves the accuracy of the estimation models by iteratively updating them with newly evaluated solutions. Experimental results on the DTLZ2 test problem show that the proposed I-SMOA obtains more uniformly distributed solutions than the conventional SMOA and is more robust to the initially given candidate solutions.
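
The loop described in the abstract (build Pareto front and Pareto set estimation models from the known candidates, sample a point where the estimated front is sparsest, evaluate it, and refit) can be illustrated with a small, self-contained sketch, shown below. This Python sketch is an illustration only, not the authors' implementation: the RBF-interpolation model, the angular parameterization of the front, the largest-gap sampling rule, and the omission of non-dominated filtering are all simplifying assumptions made here, and the problem is a bi-objective DTLZ2 instance.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    def dtlz2_2obj(x):
        """Bi-objective DTLZ2: x in [0, 1]^n; x[0] controls the position on the front."""
        g = np.sum((x[1:] - 0.5) ** 2)
        return np.array([(1.0 + g) * np.cos(x[0] * np.pi / 2.0),
                         (1.0 + g) * np.sin(x[0] * np.pi / 2.0)])

    rng = np.random.default_rng(seed=0)
    n_var = 11

    # Known candidate solutions: a few points close to the Pareto set
    # (tail variables near 0.5), i.e. a sparse, low-resolution front approximation.
    X = np.full((5, n_var), 0.5)
    X[:, 0] = rng.random(5)
    F = np.array([dtlz2_2obj(x) for x in X])

    for _ in range(20):  # iterative estimate -> sample -> evaluate -> update loop
        # Parameterize each evaluated solution by its angular position in objective space.
        theta = np.arctan2(F[:, 1], F[:, 0]).reshape(-1, 1)

        # Pareto set estimation model: angle -> decision variables (RBF interpolation).
        ps_model = RBFInterpolator(theta, X, kernel="thin_plate_spline")

        # Sample the midpoint of the largest angular gap, i.e. where the current
        # front approximation is sparsest.
        order = np.argsort(theta[:, 0])
        gaps = np.diff(theta[order, 0])
        i = int(np.argmax(gaps))
        theta_new = np.array([[theta[order[i], 0] + 0.5 * gaps[i]]])

        # Estimate a solution there, evaluate it exactly, and update the archive
        # (and hence the models refitted in the next iteration).
        x_new = np.clip(ps_model(theta_new)[0], 0.0, 1.0)
        X = np.vstack([X, x_new])
        F = np.vstack([F, dtlz2_2obj(x_new)])

    print(F[np.argsort(np.arctan2(F[:, 1], F[:, 0]))])  # upconverted front approximation

Each pass refits the estimation model to the current archive, places the next sample in the sparsest region of the estimated front, and feeds the exactly evaluated solution back into the archive; this per-solution model update is what distinguishes the iterative I-SMOA from a one-shot SMOA that samples all new points from models built once.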

Acknowledgment

This work was supported by JSPS KAKENHI Grant Number 22H03660.

Author information

Corresponding author

Correspondence to Hiroyuki Sato.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Takagi, T., Takadama, K., Sato, H. (2023). Pareto Front Upconvert by Iterative Estimation Modeling and Solution Sampling. In: Emmerich, M., et al. Evolutionary Multi-Criterion Optimization. EMO 2023. Lecture Notes in Computer Science, vol 13970. Springer, Cham. https://doi.org/10.1007/978-3-031-27250-9_16

  • DOI: https://doi.org/10.1007/978-3-031-27250-9_16

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-27249-3

  • Online ISBN: 978-3-031-27250-9

  • eBook Packages: Computer Science, Computer Science (R0)
