
Neuromemetic Evolutionary Optimization

  • Conference paper
  • In: Parallel Problem Solving from Nature – PPSN XVI (PPSN 2020)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12269)


Abstract

Discrete and combinatorial optimization can be notoriously difficult due to the complex and rugged characteristics of the objective function. We address this challenge by mapping the search process to a continuous space using recurrent neural networks. Alongside an evolutionary run, we learn three mappings: from the original search space to a continuous Cartesian latent space, from that latent space back to the search space, and from the latent space to the search objective. We elicit a gradient from the last of these networks and use it to perform moves in the latent space, and we apply this Neuromemetic Evolutionary Optimization (NEO) to the evolutionary synthesis of programs. Evaluation on a range of benchmarks suggests that NEO significantly outperforms conventional genetic programming.
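
To make the mechanism concrete, the following is a minimal sketch of the loop the abstract describes, not the authors' implementation: it substitutes small feed-forward networks for the paper's recurrent ones, uses fixed-length token sequences and a toy fitness function, and all names, dimensions, and step sizes are illustrative assumptions.

# Minimal sketch of the NEO loop from the abstract (not the authors' code).
# Assumptions: fixed-length token "programs", MLPs instead of recurrent
# networks, and a toy fitness that rewards token diversity.
import torch
import torch.nn as nn

torch.manual_seed(0)
SEQ_LEN, VOCAB, LATENT, POP = 8, 5, 16, 32

def fitness(tokens):
    # Stand-in for the true (expensive, discrete) search objective.
    return len(set(tokens.tolist())) / VOCAB

encoder = nn.Sequential(nn.Linear(SEQ_LEN * VOCAB, 64), nn.Tanh(),
                        nn.Linear(64, LATENT))
decoder = nn.Sequential(nn.Linear(LATENT, 64), nn.Tanh(),
                        nn.Linear(64, SEQ_LEN * VOCAB))
surrogate = nn.Sequential(nn.Linear(LATENT, 64), nn.Tanh(),
                          nn.Linear(64, 1))
opt = torch.optim.Adam([*encoder.parameters(), *decoder.parameters(),
                        *surrogate.parameters()], lr=1e-3)

def one_hot(tokens):
    return nn.functional.one_hot(tokens, VOCAB).float().flatten()

population = [torch.randint(0, VOCAB, (SEQ_LEN,)) for _ in range(POP)]

for gen in range(200):
    # 1) Learn the three mappings on the current population:
    #    search space -> latent, latent -> search space, latent -> objective.
    x = torch.stack([one_hot(t) for t in population])
    y = torch.tensor([[fitness(t)] for t in population])
    z = encoder(x)
    recon = nn.functional.cross_entropy(decoder(z).view(-1, VOCAB),
                                        torch.stack(population).view(-1))
    surr = nn.functional.mse_loss(surrogate(z), y)
    opt.zero_grad(); (recon + surr).backward(); opt.step()

    # 2) Memetic move: ascend the surrogate's gradient in latent space,
    #    then decode back to discrete candidates.
    z = encoder(x).detach().requires_grad_(True)
    grad, = torch.autograd.grad(surrogate(z).sum(), z)
    moved = decoder(z + 0.5 * grad).view(POP, SEQ_LEN, VOCAB)
    children = [m.argmax(dim=-1) for m in moved]

    # 3) Survival selection on the true objective.
    population = sorted(population + children, key=fitness, reverse=True)[:POP]

print("best fitness:", fitness(population[0]))

In NEO proper, the encoder and decoder are recurrent networks over variable-length programs, and the latent gradient move complements the ordinary evolutionary operators rather than replacing them; the sketch collapses these details to show the three mappings and the gradient step end to end.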



Acknowledgments

This research has been partially supported by the statutory funds of Poznan University of Technology.

Author information

Correspondence to Paweł Liskowski.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Liskowski, P., Krawiec, K., Toklu, N.E. (2020). Neuromemetic Evolutionary Optimization. In: Bäck, T., et al. (eds.) Parallel Problem Solving from Nature – PPSN XVI. PPSN 2020. Lecture Notes in Computer Science, vol 12269. Springer, Cham. https://doi.org/10.1007/978-3-030-58112-1_43

  • DOI: https://doi.org/10.1007/978-3-030-58112-1_43

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-58111-4

  • Online ISBN: 978-3-030-58112-1

  • eBook Packages: Computer Science, Computer Science (R0)
