Recently, a mechanism called stagnation detection was proposed that automatically adjusts the mutation rate of evolutionary algorithms when they encounter local optima. The so-called SD-(1+1) EA introduced by Rajabi and Witt (GECCO 2020) adds stagnation detection to the classical (1+1) EA with standard bit mutation, which flips each bit independently with some mutation rate, and raises the mutation rate when the algorithm is likely to have encountered a local optimum.
In this paper, we investigate stagnation detection in the context of the $k$-bit flip operator of randomized local search, which flips $k$ bits chosen uniformly at random, and let stagnation detection adjust the parameter $k$. We obtain improved runtime results compared to the SD-(1+1) EA, amounting to a speed-up of up to a factor of $(1-o(1))\sqrt{2\pi m}$, where $m$ is the gap size of the underlying jump function. Moreover, we propose additional schemes that prevent infinite optimization times even if the algorithm misses a working choice of $k$ due to unlucky events. Finally, we present an example where standard bit mutation still outperforms the local $k$-bit flip operator with stagnation detection.
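The mechanism described above can be sketched in a few lines of code. The following is a hedged sketch, not the paper's exact pseudocode: it keeps a flip strength `r`, flips `r` uniformly chosen distinct bits per step, and raises `r` once roughly $\binom{n}{r}\ln R$ steps pass without improvement. The function name `sd_rls`, the parameter names `R` and `r_max`, and the wrap-around rule for `r` (a crude stand-in for the safety schemes mentioned in the abstract) are all illustrative choices, not taken from the paper.

```python
import math
import random

def sd_rls(f, n, max_iters, R=25, r_max=None):
    """Sketch of randomized local search with stagnation detection.

    Flips r uniformly chosen distinct bits per step.  The strength r is
    raised once about C(n, r) * ln(R) steps pass without a strict
    improvement, i.e., once all r-bit flips have likely been tried.
    To avoid infinite runs after an unlucky phase, r wraps back to 1
    instead of growing forever (an illustrative safety rule).
    """
    r_max = r_max or n
    x = [random.randrange(2) for _ in range(n)]
    fx = f(x)
    r, fails = 1, 0
    for _ in range(max_iters):
        y = x[:]
        for i in random.sample(range(n), r):
            y[i] ^= 1                       # flip r distinct bits
        fy = f(y)
        if fy > fx:                         # strict improvement: reset
            x, fx, r, fails = y, fy, 1, 0
        else:
            fails += 1
            if fails > math.comb(n, r) * math.log(R):
                r = r + 1 if r < r_max else 1   # stagnation: change strength
                fails = 0
    return x, fx
```

For instance, `sd_rls(sum, 16, 100_000)` maximizes OneMax on 16 bits; on unimodal parts of the search space the strength stays at 1, and it only escalates when progress stalls.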
The Paper Story
Self-adjusting evolutionary algorithms can learn good or even near-optimal parameter settings.
- Examples of popular self-adjusting EAs: the (1+(λ,λ)) GA, which adjusts its parameter λ via a one-fifth success rule, and the two-rate (1+λ) EA, which adjusts its mutation rate toward the more successful of two candidate rates.
These self-adjusting schemes work because they, in some sense, measure how successful different parameter choices are relative to each other. In a local optimum, however, there is essentially no such signal: every choice fails. Classical self-adjusting algorithms would then take many unsuccessful steps and set the mutation rate to its minimum, although that is often not the best choice for leaving the local optimum. On multimodal problems, increasing the rate is often beneficial instead, so existing self-adjusting EAs perform poorly in this situation.
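To see why a purely success-based scheme drives the rate to its minimum in a local optimum, consider a generic multiplicative update in the spirit of the one-fifth success rule. This is a sketch: the helper name `update` and the constants are illustrative, not those of any specific algorithm from the literature.

```python
def update(rate, success, n, A=2.0, b=2 ** -0.25):
    """One-fifth-rule-style multiplicative update (illustrative constants):
    one success (factor A) balances four failures (factor b each), so the
    rate is stable when about one in five steps succeeds.  The rate is
    clamped to the interval [1/n, 1/2]."""
    rate *= A if success else b
    return min(max(rate, 1.0 / n), 0.5)

n, rate = 100, 0.1
for _ in range(200):                  # a long failure streak, exactly what
    rate = update(rate, False, n)     # sitting in a local optimum produces
```

After the failure streak, the rate has collapsed to its lower bound $1/n$, even though a larger rate may be what is needed to jump out of the local optimum.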
- Multimodal functions have at least one local optimum, i.e., a point $x$ that is not globally optimal but satisfies $f(y) \le f(x)$ for all Hamming neighbors $y$ of $x$.
Example: in the jump function $\mathrm{Jump}_m$, every point $x$ with exactly $n-m$ one-bits satisfies $f(y) \le f(x)$ for all Hamming neighbors $y$, so all such points are local optima, while the global optimum is the all-ones string.
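The definition can be checked mechanically. The sketch below (helper names `jump`, `is_local_optimum`, and `local_opt` are chosen for this illustration) verifies that a point with exactly $n-m$ one-bits beats all its Hamming neighbors under $\mathrm{Jump}_m$ yet is not globally optimal:

```python
def jump(x, m):
    """Jump_m: OneMax with a fitness valley of width m before the optimum."""
    n, ones = len(x), sum(x)
    if ones <= n - m or ones == n:
        return m + ones
    return n - ones          # the valley: more ones means worse fitness

def is_local_optimum(x, f):
    """True if no Hamming neighbor of x is strictly better."""
    def flipped(i):
        y = x[:]
        y[i] ^= 1
        return y
    return all(f(flipped(i)) <= f(x) for i in range(len(x)))

n, m = 8, 3
f = lambda x: jump(x, m)
local_opt = [1] * (n - m) + [0] * m   # a point with exactly n - m one-bits
```

Here `local_opt` is a local optimum with fitness $m + (n-m) = n$, while the all-ones string has fitness $m + n$, so $\mathrm{Jump}_m$ is multimodal.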