In real-world optimization tasks, the objective (i.e., fitness) function evaluation is often disturbed by noise arising from a wide range of uncertainties. Evolutionary algorithms are often employed for noisy optimization, where reducing the negative effect of noise is a crucial issue. Sampling is a popular strategy for dealing with noise: to estimate the fitness of a solution, it evaluates the fitness multiple (k) times independently and then uses the sample average to approximate the true fitness. Sampling clearly brings the fitness estimate closer to the true value, but it also increases the estimation cost. Previous studies focused mainly on empirical analysis and the design of efficient sampling strategies, while the impact of sampling has remained unclear from a theoretical viewpoint. In this article, we show via rigorous running time analysis that sampling can speed up noisy evolutionary optimization exponentially. For the (1+1)-EA solving the OneMax and LeadingOnes problems under prior (e.g., one-bit) or posterior (e.g., additive Gaussian) noise, we prove that, under a high noise level, sampling can reduce the running time from exponential to polynomial. The analysis also shows that a difference of one in the value of k can lead to an exponential difference in the expected running time, cautioning that k must be selected carefully. Using two illustrative examples, we further prove that sampling can handle noise more effectively than parent populations and threshold selection, two strategies that have been shown to be robust to noise. Finally, we show that sampling can be ineffective when noise does not have a negative impact. © 2018 by the Massachusetts Institute of Technology.
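The sampling strategy described above can be illustrated with a minimal Python sketch (not taken from the article): a (1+1)-EA on OneMax under one-bit prior noise, where each fitness comparison averages k independent noisy evaluations. The noise probability `p = 0.9` and the budget parameters are illustrative choices, not values from the paper.

```python
import random


def onemax(x):
    """True fitness: the number of 1-bits in x."""
    return sum(x)


def noisy_onemax(x, p=0.9):
    """One-bit prior noise: with probability p, evaluate a copy of x
    with one uniformly chosen bit flipped; otherwise evaluate x itself."""
    if random.random() < p:
        y = list(x)
        i = random.randrange(len(y))
        y[i] = 1 - y[i]
        return onemax(y)
    return onemax(x)


def sampled_fitness(x, k):
    """Estimate the fitness of x by averaging k independent noisy evaluations."""
    return sum(noisy_onemax(x) for _ in range(k)) / k


def one_plus_one_ea(n, k, max_evals=200000):
    """(1+1)-EA with sampling: flip each bit of the parent with probability
    1/n, and accept the offspring if its sampled fitness is at least the
    parent's. Returns the number of noisy evaluations used (the experiment
    checks the true optimum only to know when to stop)."""
    x = [random.randrange(2) for _ in range(n)]
    evals = 0
    while evals < max_evals:
        if onemax(x) == n:
            return evals
        y = [b ^ (random.random() < 1 / n) for b in x]  # bitwise mutation
        fx = sampled_fitness(x, k)
        fy = sampled_fitness(y, k)
        evals += 2 * k
        if fy >= fx:
            x = y
    return evals
```

Averaging over k samples shrinks the estimator's variance by a factor of k, which is why a larger k makes the comparison `fy >= fx` more reliable, at the price of k-fold evaluation cost per individual.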
Bibliographical note: We would like to thank Per Kristian Lehre for helpful discussions during the PPSN’14 conference, and thank the associate editor and anonymous reviewers for helpful comments and suggestions. This work was partially supported by the NSFC (61329302, 61333014, 61375061, 61603367, 61672478), the Joint Research Fund for Overseas Chinese, Hong Kong, and Macao Scholars of the NSFC (61428302), the EPSRC (EP/K001523/1), the Jiangsu Science Foundation (BK20160066), the Royal Society Newton Advanced Fellowship (NA150123), the Fundamental Research Funds for the Central Universities (WK2150110002), the CCF-Tencent Open Research Fund, and the Collaborative Innovation Center of Novel Software Technology and Industrialization. Xin Yao was also supported by a Royal Society Wolfson Research Merit Award.
- Computational complexity
- Evolutionary algorithms
- Optimization in noisy environments
- Robust optimization
- Running time analysis