Abstract
This paper introduces a variant of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), denoted gl-CMA-ES, that employs Graphical Lasso regularization. Our goal is to efficiently solve a certain class of partially separable optimization problems by performing stochastic search with a search model parameterized by a sparse precision matrix, i.e., a sparse inverse covariance matrix. We illustrate the effect of the global weight of the \(\ell_1\) regularizer and investigate how the Graphical Lasso with non-equal weights can be combined with CMA-ES, allowing the conditional dependency structure of problems with sparse Hessian matrices to be learned. For non-separable sparse problems, the proposed method with appropriately selected weights outperforms CMA-ES and improves its scaling, while for dense problems it maintains the same performance.
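To make the core idea concrete, the Graphical Lasso estimates a sparse precision matrix \(\Theta\) from an empirical covariance \(S\) by maximizing the \(\ell_1\)-penalized log-likelihood \(\log\det\Theta - \operatorname{tr}(S\Theta) - \alpha\|\Theta\|_1\) [7]. The following is a minimal sketch, not the authors' gl-CMA-ES: a simplified evolution-strategy loop in which each covariance update is followed by a Graphical Lasso step that sparsifies the precision matrix of the sampling distribution. Step-size adaptation, evolution paths, and recombination weights are omitted, and the test function `f`, the penalty `alpha`, and the mixing constant are illustrative choices. Note also that scikit-learn's `graphical_lasso` accepts only a scalar penalty; the non-equal (element-wise) weights studied in the paper would require a solver that accepts a penalty matrix, such as the QUIC solver in the skggm package [12].

```python
# Minimal sketch (not the authors' gl-CMA-ES): an ES-style loop whose
# covariance update is regularized by the Graphical Lasso so that the
# search distribution's precision matrix stays sparse. Step-size control,
# evolution paths and recombination weights of a full CMA-ES are omitted;
# `f`, `alpha` and the 0.5 mixing constant are illustrative assumptions.
import numpy as np
from sklearn.covariance import graphical_lasso

def f(x):
    # Toy partially separable objective with a sparse (tridiagonal) Hessian.
    return np.sum(x**2) + np.sum(x[:-1] * x[1:])

n, lam, mu, alpha = 10, 20, 5, 0.05   # dimension, offspring, parents, l1 weight
rng = np.random.default_rng(0)
mean, sigma, C = np.zeros(n), 1.0, np.eye(n)

for _ in range(200):
    # Sample lambda candidates from N(mean, sigma^2 C) and select the best mu.
    X = mean + sigma * rng.multivariate_normal(np.zeros(n), C, size=lam)
    X = X[np.argsort([f(x) for x in X])][:mu]
    Y = (X - mean) / sigma
    C_emp = 0.5 * C + 0.5 * (Y.T @ Y) / mu        # crude rank-mu-style update
    # Graphical Lasso step: re-estimate C with a sparse precision matrix.
    C, _precision = graphical_lasso(C_emp, alpha=alpha)
    mean = X.mean(axis=0)

print("best f:", f(mean))
```

With `alpha = 0`, the Graphical Lasso step reduces to a plain maximum-likelihood estimate and the sketch degenerates to an unregularized covariance update; increasing `alpha` forces more entries of the precision matrix to zero, which corresponds to removing conditional dependencies between variables.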
Notes
1. An extension of CMA-ES, called Active CMA-ES [10], has been proposed that performs recombination with both positive and negative weights, using all sampled solutions.
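For context, a standard formulation of this active update (following [10]; the constants below are generic, not taken from the present paper) adapts the covariance matrix with signed recombination weights:

\[
C \leftarrow \Bigl(1 - c_\mu \sum_{i=1}^{\lambda} w_i\Bigr)\, C \;+\; c_\mu \sum_{i=1}^{\lambda} w_i \, y_{i:\lambda}\, y_{i:\lambda}^{\top},
\]

where \(y_{i:\lambda} = (x_{i:\lambda} - m)/\sigma\) is the step of the \(i\)-th best sample and the weights satisfy \(w_1 \geq \dots \geq w_\lambda\), with \(w_i < 0\) for the worst-ranked samples.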
References
Akimoto, Y., Hansen, N.: Online model selection for restricted covariance matrix adaptation. In: Handl, J., Hart, E., Lewis, P.R., López-Ibáñez, M., Ochoa, G., Paechter, B. (eds.) PPSN 2016. LNCS, vol. 9921, pp. 3–13. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-45823-6_1
Akimoto, Y., Hansen, N.: Projection-based restricted covariance matrix adaptation for high dimension. In: Genetic and Evolutionary Computation Conference (GECCO 2016), Denver, United States, pp. 197–204, July 2016
Akimoto, Y., Hansen, N.: Diagonal acceleration for covariance matrix adaptation evolution strategies. Evol. Comput., 1–31 (2019)
d’Aspremont, A., Banerjee, O., El Ghaoui, L.: First-order methods for sparse covariance selection. SIAM J. Matrix Anal. Appl. 30(1), 56–66 (2008)
Dempster, A.P.: Covariance selection. Biometrics 28, 157–175 (1972)
Fan, J., Liao, Y., Liu, H.: An overview of the estimation of large covariance and precision matrices. Econom. J. 19(1), C1–C32 (2016)
Friedman, J., Hastie, T., Tibshirani, R.: Sparse inverse covariance estimation with the graphical lasso. Biostatistics 9(3), 432–441 (2008)
Griewank, A., Toint, P.: On the unconstrained optimization of partially separable functions. In: Nonlinear Optimization 1981, pp. 301–312. Academic Press (1982)
Hansen, N., Ostermeier, A.: Completely derandomized self-adaptation in evolution strategies. Evol. Comput. 9(2), 159–195 (2001)
Jastrebski, G.A., Arnold, D.V.: Improving evolution strategies through active covariance matrix adaptation. In: 2006 IEEE International Conference on Evolutionary Computation, pp. 2814–2821. IEEE (2006)
Knight, J.N., Lunacek, M.: Reducing the space-time complexity of the CMA-ES. In: Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation. GECCO 2007, pp. 658–665. Association for Computing Machinery, New York (2007). https://doi.org/10.1145/1276958.1277097
Laska, J., Narayan, M.: skggm 0.2.7: A scikit-learn compatible package for Gaussian and related Graphical Models, July 2017. https://doi.org/10.5281/zenodo.830033
Loshchilov, I.: LM-CMA: an alternative to L-BFGS for large scale black-box optimization. Evol. Comput. 25, 143–171 (2017)
Mazumder, R., Hastie, T.: Exact covariance thresholding into connected components for large-scale graphical lasso. J. Mach. Learn. Res. 13(1), 781–794 (2012)
Nocedal, J., Wright, S.: Numerical Optimization. Springer, New York (2006). https://doi.org/10.1007/978-0-387-40065-5
Ros, R., Hansen, N.: A simple modification in CMA-ES achieving linear time and space complexity. In: Rudolph, G., Jansen, T., Beume, N., Lucas, S., Poloni, C. (eds.) PPSN 2008. LNCS, vol. 5199, pp. 296–305. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-87700-4_30
Varelas, K., et al.: A comparative study of large-scale variants of CMA-ES. In: Auger, A., Fonseca, C.M., Lourenço, N., Machado, P., Paquete, L., Whitley, D. (eds.) PPSN 2018. LNCS, vol. 11101, pp. 3–15. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-99253-2_1
Acknowledgement
The PhD thesis of Konstantinos Varelas is funded by the French MoD DGA/MRIS and Thales Land & Air Systems.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Varelas, K., Auger, A., Hansen, N. (2020). Sparse Inverse Covariance Learning for CMA-ES with Graphical Lasso. In: Bäck, T., et al. Parallel Problem Solving from Nature – PPSN XVI. PPSN 2020. Lecture Notes in Computer Science(), vol 12269. Springer, Cham. https://doi.org/10.1007/978-3-030-58112-1_49
DOI: https://doi.org/10.1007/978-3-030-58112-1_49
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-58111-4
Online ISBN: 978-3-030-58112-1
eBook Packages: Computer Science (R0)