Sparse Inverse Covariance Learning for CMA-ES with Graphical Lasso

  • Conference paper
  • First Online:
Parallel Problem Solving from Nature – PPSN XVI (PPSN 2020)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12269)

Abstract

This paper introduces a variant of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), denoted gl-CMA-ES, that utilizes Graphical Lasso regularization. Our goal is to efficiently solve a certain class of partially separable optimization problems by performing stochastic search with a search model parameterized by a sparse precision (i.e., inverse covariance) matrix. We illustrate the effect of the global weight of the \(l_1\) regularizer and investigate how Graphical Lasso with non-equal weights can be combined with CMA-ES, allowing it to learn the conditional dependency structure of problems with sparse Hessian matrices. For non-separable sparse problems, the proposed method, with appropriately selected weights, outperforms CMA-ES and improves its scaling, while for dense problems it maintains the same performance.
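To make the key ingredient concrete, the following is a minimal sketch of sparse precision estimation with the graphical lasso, using scikit-learn's graphical_lasso (reference [12] provides a compatible skggm implementation). This is not the authors' gl-CMA-ES update rule: the toy tridiagonal model, the sample size, and the penalty strength alpha are illustrative assumptions.

    # Hedged sketch: sparse precision estimation with the graphical lasso.
    # NOT the gl-CMA-ES algorithm itself; the model, sample size, and the
    # penalty strength `alpha` are illustrative assumptions.
    import numpy as np
    from sklearn.covariance import graphical_lasso

    rng = np.random.default_rng(0)
    n = 10

    # Ground truth with sparse (tridiagonal) precision, i.e. a chain of
    # conditional dependencies -- the structure gl-CMA-ES aims to learn.
    true_prec = np.eye(n) + 0.4 * (np.eye(n, k=1) + np.eye(n, k=-1))
    true_cov = np.linalg.inv(true_prec)

    # Empirical covariance, standing in for the dense matrix that plain
    # CMA-ES would adapt from sampled solutions.
    x = rng.multivariate_normal(np.zeros(n), true_cov, size=200)
    emp_cov = np.cov(x, rowvar=False)

    # l1-penalized maximum-likelihood estimate of the precision matrix;
    # larger alpha drives more off-diagonal entries to exactly zero.
    cov_gl, prec_gl = graphical_lasso(emp_cov, alpha=0.05)

    off_diag = prec_gl - np.diag(np.diag(prec_gl))
    print("nonzero off-diagonal precision entries:",
          np.count_nonzero(np.abs(off_diag) > 1e-8))

In the gl-CMA-ES setting, such a sparsified model would replace the dense covariance matrix when sampling the next population, so that the number of dependencies to be learned matches the sparsity of the problem's Hessian.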

Notes

  1. An extension of CMA-ES, called Active CMA-ES [10], has been proposed; it performs recombination with both positive and negative weights, using all sampled solutions. A schematic sketch of such an update is given below.
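For intuition, a generalized rank-\(\mu\) covariance update with possibly negative weights can be written as \(C \leftarrow (1 - c_\mu \sum_i w_i)\,C + c_\mu \sum_i w_i\, y_i y_i^\top\), where the steps \(y_i\) are ranked from best to worst and the worst-ranked receive negative weights. The Python sketch below is illustrative only, in the spirit of [10]; the weight values and learning rate \(c_\mu\) are assumptions, not the published constants.

    # Hedged sketch of an active (negative-weight) rank-mu update in the
    # spirit of [10]; weight values and c_mu are illustrative assumptions.
    import numpy as np

    def active_rank_mu_update(C, steps, weights, c_mu=0.1):
        # steps: sequence of vectors y_i = (x_i - mean) / sigma, ranked
        # from best to worst; weights: positive first, negative last.
        rank_mu = sum(w * np.outer(y, y) for w, y in zip(weights, steps))
        return (1.0 - c_mu * np.sum(weights)) * C + c_mu * rank_mu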

References

  1. Akimoto, Y., Hansen, N.: Online model selection for restricted covariance matrix adaptation. In: Handl, J., Hart, E., Lewis, P.R., López-Ibáñez, M., Ochoa, G., Paechter, B. (eds.) PPSN 2016. LNCS, vol. 9921, pp. 3–13. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-45823-6_1

  2. Akimoto, Y., Hansen, N.: Projection-based restricted covariance matrix adaptation for high dimension. In: Genetic and Evolutionary Computation Conference (GECCO 2016), Denver, United States, pp. 197–204, July 2016

  3. Akimoto, Y., Hansen, N.: Diagonal acceleration for covariance matrix adaptation evolution strategies. Evol. Comput., 1–31 (2019)

  4. d’Aspremont, A., Banerjee, O., El Ghaoui, L.: First-order methods for sparse covariance selection. SIAM J. Matrix Anal. Appl. 30(1), 56–66 (2008)

  5. Dempster, A.P.: Covariance selection. Biometrics 28, 157–175 (1972)

  6. Fan, J., Liao, Y., Liu, H.: An overview of the estimation of large covariance and precision matrices. Econom. J. 19(1), C1–C32 (2016)

  7. Friedman, J., Hastie, T., Tibshirani, R.: Sparse inverse covariance estimation with the graphical lasso. Biostatistics 9(3), 432–441 (2008)

  8. Griewank, A., Toint, P.: On the unconstrained optimization of partially separable functions. In: Nonlinear Optimization 1981, pp. 301–312. Academic Press (1982)

  9. Hansen, N., Ostermeier, A.: Completely derandomized self-adaptation in evolution strategies. Evol. Comput. 9(2), 159–195 (2001)

  10. Jastrebski, G.A., Arnold, D.V.: Improving evolution strategies through active covariance matrix adaptation. In: 2006 IEEE International Conference on Evolutionary Computation, pp. 2814–2821. IEEE (2006)

  11. Knight, J.N., Lunacek, M.: Reducing the space-time complexity of the CMA-ES. In: Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation. GECCO 2007, pp. 658–665. Association for Computing Machinery, New York (2007). https://doi.org/10.1145/1276958.1277097

  12. Laska, J., Narayan, M.: skggm 0.2.7: A scikit-learn compatible package for Gaussian and related Graphical Models, July 2017. https://doi.org/10.5281/zenodo.830033

  13. Loshchilov, I.: LM-CMA: an alternative to L-BFGS for large scale black-box optimization. Evol. Comput. 25, 143–171 (2017)

  14. Mazumder, R., Hastie, T.: Exact covariance thresholding into connected components for large-scale graphical lasso. J. Mach. Learn. Res. 13(1), 781–794 (2012)

  15. Nocedal, J., Wright, S.: Numerical Optimization. Springer, New York (2006). https://doi.org/10.1007/978-0-387-40065-5

  16. Ros, R., Hansen, N.: A simple modification in CMA-ES achieving linear time and space complexity. In: Rudolph, G., Jansen, T., Beume, N., Lucas, S., Poloni, C. (eds.) PPSN 2008. LNCS, vol. 5199, pp. 296–305. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-87700-4_30

  17. Varelas, K., et al.: A comparative study of large-scale variants of CMA-ES. In: Auger, A., Fonseca, C.M., Lourenço, N., Machado, P., Paquete, L., Whitley, D. (eds.) PPSN 2018. LNCS, vol. 11101, pp. 3–15. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-99253-2_1

Acknowledgement

The PhD thesis of Konstantinos Varelas is funded by the French MoD DGA/MRIS and Thales Land & Air Systems.

Author information

Corresponding author

Correspondence to Konstantinos Varelas.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Varelas, K., Auger, A., Hansen, N. (2020). Sparse Inverse Covariance Learning for CMA-ES with Graphical Lasso. In: Bäck, T., et al. (eds.) Parallel Problem Solving from Nature – PPSN XVI. PPSN 2020. Lecture Notes in Computer Science, vol. 12269. Springer, Cham. https://doi.org/10.1007/978-3-030-58112-1_49

  • DOI: https://doi.org/10.1007/978-3-030-58112-1_49

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-58111-4

  • Online ISBN: 978-3-030-58112-1

  • eBook Packages: Computer Science, Computer Science (R0)
