Abstract
In noisy evolutionary optimization, sampling is a common strategy for dealing with noise. Under this strategy, the fitness of a solution is evaluated independently multiple times (the number of evaluations is called the sample size), and its true fitness is then approximated by the average of these evaluations. Most previous studies on sampling are empirical, and the few theoretical studies mainly showed the effectiveness of sampling with a sufficiently large sample size. In this paper, we theoretically examine which strategies can work when sampling with any fixed sample size fails. By constructing a family of artificial noisy examples, we prove that sampling is always ineffective on these examples, whereas using parent or offspring populations can be helpful on some of them. We also construct an artificial noisy example to show that when neither sampling nor populations are effective, a tailored adaptive sampling strategy (i.e., sampling with an adaptive sample size) can work. These findings may enhance our understanding of sampling to some extent, but future work is required to validate them in natural situations.
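For concreteness, the following is a minimal sketch of the sampling strategy described above. The noisy fitness model (OneMax with additive Gaussian noise) and the sample size k are illustrative assumptions chosen for this sketch; they are not the constructions analyzed in the paper.

```python
# Minimal sketch of sampling in noisy evolutionary optimization (assumptions:
# OneMax as the true fitness, additive Gaussian noise, fixed sample size k).
import random


def onemax(x):
    """True fitness in this toy example: number of 1-bits."""
    return sum(x)


def noisy_fitness(x, sigma=1.0):
    """One independent noisy evaluation (additive Gaussian noise, assumed)."""
    return onemax(x) + random.gauss(0.0, sigma)


def sampled_fitness(x, k=10, sigma=1.0):
    """Sampling with fixed sample size k: the average of k independent
    noisy evaluations approximates the true fitness. An adaptive sampling
    strategy would instead adjust k during the run."""
    return sum(noisy_fitness(x, sigma) for _ in range(k)) / k


if __name__ == "__main__":
    x = [random.randint(0, 1) for _ in range(20)]
    print("true fitness:       ", onemax(x))
    print("single evaluation:  ", round(noisy_fitness(x), 2))
    print("average of k = 50:  ", round(sampled_fitness(x, k=50), 2))
```

With a larger k, the averaged estimate concentrates around the true fitness, which is the intuition behind the known positive results for sufficiently large sample sizes; the paper's examples are designed so that no fixed k suffices.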
References
Akimoto, Y., Astete-Morales, S., Teytaud, O.: Analysis of runtime of optimization algorithms for noisy functions over discrete codomains. Theoret. Comput. Sci. 605, 42–50 (2015)
Auger, A., Doerr, B.: Theory of Randomized Search Heuristics: Foundations and Recent Developments. World Scientific, Singapore (2011)
Bian, C., Qian, C., Tang, K.: Towards a running time analysis of the (1+1)-EA for OneMax and LeadingOnes under general bit-wise noise. In: Proceedings of the 15th International Conference on Parallel Problem Solving from Nature (PPSN’18), pp. 165–177. Coimbra, Portugal (2018)
Branke, J., Schmidt, C.: Sequential sampling in noisy environments. In: Proceedings of the 8th International Conference on Parallel Problem Solving from Nature (PPSN’04), pp. 202–211. Birmingham, UK (2004)
Cantú-Paz, E.: Adaptive sampling for noisy problems. In: Proceedings of the 6th ACM Conference on Genetic and Evolutionary Computation (GECCO’04), pp. 947–958. Seattle, WA (2004)
Dang, D.C., Lehre, P.K.: Efficient optimisation of noisy fitness functions with population-based evolutionary algorithms. In: Proceedings of the 13th ACM Conference on Foundations of Genetic Algorithms (FOGA’15), pp. 62–68. Aberystwyth, UK (2015)
Dang-Nhu, R., Dardinier, T., Doerr, B., Izacard, G., Nogneng, D.: A new analysis method for evolutionary optimization of dynamic and noisy objective functions. In: Proceedings of the 20th ACM Conference on Genetic and Evolutionary Computation (GECCO’18), pp. 1467–1474. Kyoto, Japan (2018)
Devroye, L., Lugosi, G.: Combinatorial Methods in Density Estimation. Springer, New York (2001)
Doerr, B., Hota, A., Kötzing, T.: Ants easily solve stochastic shortest path problems. In: Proceedings of the 14th ACM Conference on Genetic and Evolutionary Computation (GECCO’12), pp. 17–24. Philadelphia, PA (2012)
Doerr, B., Johannsen, D., Winzen, C.: Multiplicative drift analysis. Algorithmica 64(4), 673–697 (2012)
Droste, S.: Analysis of the (1+1) EA for a noisy OneMax. In: Proceedings of the 6th ACM Conference on Genetic and Evolutionary Computation (GECCO’04), pp. 1088–1099. Seattle, WA (2004)
Feldmann, M., Kötzing, T.: Optimizing expected path lengths with ant colony optimization using fitness proportional update. In: Proceedings of the 12th ACM Conference on Foundations of Genetic Algorithms (FOGA’13), pp. 65–74. Adelaide, Australia (2013)
Friedrich, T., Kötzing, T., Krejca, M., Sutton, A.: Robustness of ant colony optimization to noise. Evol. Comput. 24(2), 237–254 (2016)
Friedrich, T., Kötzing, T., Krejca, M., Sutton, A.: The compact genetic algorithm is efficient under extreme Gaussian noise. IEEE Trans. Evol. Comput. 21(3), 477–490 (2017)
Gießen, C., Kötzing, T.: Robustness of populations in stochastic environments. Algorithmica 75(3), 462–489 (2016)
Hajek, B.: Hitting-time and occupation-time bounds implied by drift analysis with applications. Adv. Appl. Probab. 14(3), 502–525 (1982)
He, J., Yao, X.: Drift analysis and average time complexity of evolutionary algorithms. Artif. Intell. 127(1), 57–85 (2001)
Li, G., Chou, W.: Path planning for mobile robot using self-adaptive learning particle swarm optimization. Sci. China Inf. Sci. 61(5), 052204 (2018)
Mukhopadhyay, A., Maulik, U., Bandyopadhyay, S., Coello Coello, C.A.: A survey of multiobjective evolutionary algorithms for data mining: Part I. IEEE Trans. Evol. Comput. 18(1), 4–19 (2013)
Neumann, F., Witt, C.: Bioinspired Computation in Combinatorial Optimization: Algorithms and Their Computational Complexity. Springer, Berlin (2010)
Oliveto, P., Witt, C.: Simplified drift analysis for proving lower bounds in evolutionary computation. Algorithmica 59(3), 369–386 (2011)
Oliveto, P., Witt, C.: Erratum: Simplified drift analysis for proving lower bounds in evolutionary computation. arXiv:1211.7184 (2012)
Oliveto, P., Witt, C.: On the runtime analysis of the simple genetic algorithm. Theoret. Comput. Sci. 545, 2–19 (2014)
Prügel-Bennett, A., Rowe, J., Shapiro, J.: Run-time analysis of population-based evolutionary algorithm in noisy environments. In: Proceedings of the 13th ACM Conference on Foundations of Genetic Algorithms (FOGA’15), pp. 69–75. Aberystwyth, UK (2015)
Qian, C.: Distributed Pareto optimization for large-scale noisy subset selection. IEEE Trans. Evol. Comput. (2020)
Qian, C., Bian, C., Jiang, W., Tang, K.: Running time analysis of the (1+1)-EA for OneMax and LeadingOnes under bit-wise noise. Algorithmica 81(2), 749–795 (2019)
Qian, C., Bian, C., Yu, Y., Tang, K., Yao, X.: Analysis of noisy evolutionary optimization when sampling fails. In: Proceedings of the 20th ACM Conference on Genetic and Evolutionary Computation (GECCO’18), pp. 1507–1514. Kyoto, Japan (2018)
Qian, C., Shi, J.C., Yu, Y., Tang, K., Zhou, Z.H.: Subset selection under noise. In: Advances in Neural Information Processing Systems 30 (NIPS’17), pp. 3562–3572. Long Beach, CA (2017)
Qian, C., Yu, Y., Tang, K., Jin, Y., Yao, X., Zhou, Z.H.: On the effectiveness of sampling for evolutionary optimization in noisy environments. Evol. Comput. 26(2), 237–267 (2018)
Qian, C., Yu, Y., Zhou, Z.H.: Analyzing evolutionary optimization in noisy environments. Evol. Comput. 26(1), 1–41 (2018)
Sudholt, D.: On the robustness of evolutionary algorithms to noise: Refined results and an example where noise helps. In: Proceedings of the 20th ACM Conference on Genetic and Evolutionary Computation (GECCO’18), pp. 1523–1530. Kyoto, Japan (2018)
Sudholt, D., Thyssen, C.: A simple ant colony optimizer for stochastic shortest path problems. Algorithmica 64(4), 643–672 (2012)
Syberfeldt, A., Ng, A., John, R., Moore, P.: Evolutionary optimisation of noisy multi-objective problems using confidence-based dynamic resampling. Eur. J. Oper. Res. 204(3), 533–544 (2010)
Tyurin, I.S.: An improvement of upper estimates of the constants in the Lyapunov theorem. Russ. Math. Surv. 65(3), 201–202 (2010)
Witt, C.: Runtime analysis of the (μ+1) EA on simple pseudo-Boolean functions. Evol. Comput. 14(1), 65–86 (2006)
Xu, P., Liu, X., Cao, H., Zhang, Z.: An efficient energy aware virtual network migration based on genetic algorithm. Front. Comput. Sci. 13(2), 440–442 (2019)
Yu, Y., Qian, C., Zhou, Z.H.: Switch analysis for running time analysis of evolutionary algorithms. IEEE Trans. Evol. Comput. 19(6), 777–792 (2015)
Zhang, Z., Xin, T.: Immune algorithm with adaptive sampling in noisy environments and its application to stochastic optimization problems. IEEE Comput. Intell. Mag. 2(4), 29–40 (2007)
Zhou, Z.H., Yu, Y., Qian, C.: Evolutionary Learning: Advances in Theories and Algorithms. Springer, Singapore (2019)
Acknowledgements
We want to thank the anonymous reviewers of GECCO’18, TEvC and Algorithmica for their valuable comments and thank Per Kristian Lehre for helpful discussions. This work was supported by the National Key Research and Development Program of China (2017YFB1003102), the NSFC (61672478, 61876077), the Shenzhen Peacock Plan (KQTD2016112514355531), and the Fundamental Research Funds for the Central Universities.
Additional information
A preliminary version of this paper has appeared at GECCO’18 [27].
About this article
Cite this article
Qian, C., Bian, C., Yu, Y. et al. Analysis of Noisy Evolutionary Optimization When Sampling Fails. Algorithmica 83, 940–975 (2021). https://doi.org/10.1007/s00453-019-00666-6