Two-Stage Greedy Approximated Hypervolume Subset Selection for Large-Scale Problems

  • Conference paper
  • First Online:
Evolutionary Multi-Criterion Optimization (EMO 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13970)

Abstract

Recently, some studies on evolutionary multi-objective optimization have demonstrated that subset selection can obtain a solution set that is better than the final population. The main challenge in this type of subset selection is how to efficiently handle a huge candidate solution set, especially when hypervolume-based subset selection is used for many-objective optimization. In this paper, we propose an efficient two-stage greedy algorithm for hypervolume-based subset selection. In each iteration of the proposed greedy algorithm, a small number of promising candidate solutions are selected in the first stage using a rough hypervolume contribution approximation. In the second stage, a single solution among them is selected using a more precise approximation. Experimental results show that the proposed algorithm is much faster than state-of-the-art hypervolume-based greedy subset selection algorithms, at the cost of a slight deterioration in the quality of the selected subset.
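
The abstract describes the structure of the two-stage greedy loop but not its implementation. The sketch below is only an illustration of that structure, not the authors' method: the rough and precise hypervolume contribution estimates are stand-ins implemented as Monte Carlo sampling with few versus many samples (the paper uses dedicated approximation methods), and the function names and parameter values (`mc_hv_contribution`, `two_stage_greedy_hss`, `n_promising`, the sample counts) are hypothetical choices made for this example.

```python
import numpy as np


def mc_hv_contribution(point, selected, ref_point, n_samples, rng):
    """Monte Carlo estimate of the exclusive hypervolume contribution of
    `point` w.r.t. the already selected points (minimization is assumed,
    and `point` is assumed to dominate `ref_point` in every objective)."""
    point = np.asarray(point, dtype=float)
    samples = rng.uniform(point, ref_point, size=(n_samples, point.size))
    if selected:
        sel = np.asarray(selected)
        # A sample is already covered if some selected point dominates it,
        # i.e., is <= the sample in every objective.
        covered = np.any(np.all(samples[:, None, :] >= sel[None, :, :], axis=2),
                         axis=1)
    else:
        covered = np.zeros(n_samples, dtype=bool)
    box_volume = float(np.prod(np.asarray(ref_point) - point))
    return box_volume * float(np.mean(~covered))


def two_stage_greedy_hss(candidates, k, ref_point, n_promising=10,
                         rough_samples=200, precise_samples=20000, seed=0):
    """Greedy subset selection with a two-stage candidate screen:
    stage 1 shortlists `n_promising` candidates with a cheap contribution
    estimate, stage 2 re-evaluates only the shortlist with a more precise
    estimate and selects the single best one."""
    rng = np.random.default_rng(seed)
    remaining = [np.asarray(c, dtype=float) for c in candidates]
    selected = []
    for _ in range(min(k, len(remaining))):
        # Stage 1: rough screening of every remaining candidate.
        rough = np.array([mc_hv_contribution(c, selected, ref_point,
                                             rough_samples, rng)
                          for c in remaining])
        shortlist = np.argsort(rough)[::-1][:n_promising]
        # Stage 2: precise evaluation of the shortlisted candidates only.
        precise = [mc_hv_contribution(remaining[i], selected, ref_point,
                                      precise_samples, rng)
                   for i in shortlist]
        best = int(shortlist[int(np.argmax(precise))])
        selected.append(remaining.pop(best))
    return np.array(selected)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # 200 random points on a 3-objective simplex-like front (minimization).
    candidates = rng.dirichlet(np.ones(3), size=200)
    subset = two_stage_greedy_hss(candidates, k=10, ref_point=np.full(3, 1.1))
    print(subset)
```

The intended benefit of the two-stage structure is that the expensive, more precise estimate is applied only to the small shortlist in each iteration, while the full candidate set is screened with the cheap estimate.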

Notes

  1. https://github.com/HisaoLabSUSTC/BenchSS.

Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant Nos. 62002152 and 61876075), the Guangdong Provincial Key Laboratory (Grant No. 2020B121201001), the Program for Guangdong Introducing Innovative and Entrepreneurial Teams (Grant No. 2017ZT07X386), the Stable Support Plan Program of Shenzhen Natural Science Fund (Grant No. 20200925174447003), and the Shenzhen Science and Technology Program (Grant No. KQTD2016112514355531).

Author information

Corresponding author

Correspondence to Hisao Ishibuchi.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Nan, Y., Ishibuchi, H., Shu, T., Shang, K. (2023). Two-Stage Greedy Approximated Hypervolume Subset Selection for Large-Scale Problems. In: Emmerich, M., et al. Evolutionary Multi-Criterion Optimization. EMO 2023. Lecture Notes in Computer Science, vol 13970. Springer, Cham. https://doi.org/10.1007/978-3-031-27250-9_28

  • DOI: https://doi.org/10.1007/978-3-031-27250-9_28

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-27249-3

  • Online ISBN: 978-3-031-27250-9

  • eBook Packages: Computer Science, Computer Science (R0)
