
Stagnation Detection with Randomized Local Search

By Amirhossein Rajabi and Carsten Witt, in Proc. of EvoCOP 2021.

  • DOI
  • arXiv
Abstract

Recently a mechanism called stagnation detection was proposed that automatically adjusts the mutation rate of evolutionary algorithms when they encounter local optima. The so-called SD-(1+1) EA introduced by Rajabi and Witt (GECCO 2020) adds stagnation detection to the classical (1+1) EA with standard bit mutation, which flips each bit independently with some mutation rate, and raises the mutation rate when the algorithm is likely to have encountered local optima.

In this paper, we investigate stagnation detection in the context of the k-bit flip operator of randomized local search, which flips k bits chosen uniformly at random, and let stagnation detection adjust the parameter k. We obtain improved runtime results compared to the SD-(1+1) EA amounting to a speed-up of up to a factor of e ≈ 2.718. Moreover, we propose additional schemes that prevent infinite optimization times even if the algorithm misses a working choice of k due to unlucky events. Finally, we present an example where standard bit mutation still outperforms the local k-bit flip with stagnation detection.

The Paper Story

Self-adjusting Mechanisms

Self-adjusting Evolutionary Algorithms can learn good or even near-optimal parameter settings.

  • Examples of popular self-adjusting EAs:
👉
(1+(λ,λ)) GA with the 1/5-rule: if f(y) > f(x) then λ ← λ/F; if f(y) ≤ f(x) then λ ← λ·F^(1/4).
👉
(1+λ) EA with two rates: create λ/2 offspring by flipping each bit with probability r/(2n) and λ/2 offspring by flipping each bit with probability 2r/n. With probability 1/2, replace r with the strength that the best offspring has been created with (but with probability 1/2 do a random step, halving or doubling r uniformly at random). A minimal sketch of this scheme follows below.
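The following is a minimal Python sketch of the two-rate scheme on OneMax. The benchmark choice, the rate cap r ∈ [2, n/4], and all function and variable names are illustrative assumptions on my part, not taken from this page:

```python
import random

def onemax(x):
    # Illustrative benchmark: number of one-bits.
    return sum(x)

def two_rate_ea(f, n, lam=10, generations=2000):
    """Sketch of the (1+lambda) EA with two self-adjusting rates."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = f(x)
    r = 2.0  # current strength; expected number of flipped bits
    for _ in range(generations):
        best_y = best_fy = best_rate = None
        for i in range(lam):
            # First half of the offspring uses strength r/2
            # (probability r/(2n) per bit), second half uses 2r.
            rate = r / 2 if i < lam // 2 else 2 * r
            y = [1 - b if random.random() < rate / n else b for b in x]
            fy = f(y)
            if best_fy is None or fy > best_fy:
                best_y, best_fy, best_rate = y, fy, rate
        # Self-adjustment: with prob. 1/2 adopt the winning strength,
        # otherwise do a random step (halve or double r).
        r = best_rate if random.random() < 0.5 else random.choice([r / 2, 2 * r])
        r = min(max(r, 2.0), n / 4)  # assumed cap, keeps r in a sane range
        if best_fy >= fx:            # elitist (1+lambda) selection
            x, fx = best_y, best_fy
    return x

print(onemax(two_rate_ea(onemax, n=50)))
```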

These self-adjusting schemes work because they in some sense measure how successful different parameter choices are relative to each other. In a local optimum, however, there is essentially no such signal. Classical self-adjusting algorithms would mostly take many unsuccessful steps in such situations and set the mutation rate to its minimum, although that might not be the best choice for leaving the local optimum. On multimodal problems, increasing the rate is often beneficial, yet existing self-adjusting EAs perform poorly in this situation.

⚠️
Most self-adjusting EAs perform poorly in local optima.
👉
Stagnation Detection is an efficient solution proposed in GECCO 2020 (story). Stagnation Detection with RLS can be even more efficient; a sketch of the idea follows below.
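Here is a minimal Python sketch of the idea behind RLS with stagnation detection: flip exactly s bits chosen uniformly at random, and raise the strength s once the number of unsuccessful steps makes it unlikely that an improvement at Hamming distance s was missed. The variable names, the default R = n³, and the exact counter threshold are assumptions on my part; the paper's precise scheme and its robust variants differ in details:

```python
import math
import random

def sd_rls(f, n, R=None, max_iters=10**6):
    """Sketch of randomized local search with stagnation detection:
    flip exactly s bits; raise s after too many unsuccessful steps."""
    R = R or n ** 3  # assumed default; controls the failure probability
    x = [random.randint(0, 1) for _ in range(n)]
    fx = f(x)
    s, fails = 1, 0  # current strength and unsuccessful-step counter
    for _ in range(max_iters):
        y = x[:]
        for i in random.sample(range(n), s):  # s-bit flip, uniform positions
            y[i] = 1 - y[i]
        fy = f(y)
        if fy > fx:           # strict improvement: move and reset strength
            x, fx, s, fails = y, fy, 1, 0
        else:
            fails += 1
            # Stagnation detected at strength s: after this many trials an
            # improving point at Hamming distance s would very likely have
            # been found already (threshold assumed, up to constants).
            if fails > math.comb(n, s) * math.log(R):
                s, fails = min(s + 1, n), 0
    return x
```

Once s matches the distance to the nearest improvement, each iteration hits a specific improving point with probability 1/C(n, s), which is the intuition behind the speed-up over standard bit mutation.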

Multimodal Functions

➕
Definition of "gap": Given a fitness function
  • Multimodal functions have at least one point such that .

Example: In the Jump_m multimodal function, gap(x) = m for all points x such that |x|₁ = n − m, i.e., for all local optima. A brute-force check of the definition is sketched below.
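For intuition, the definition can be verified by exhaustive search for small n. A tiny illustrative sketch (exponential time; the jump and gap helpers are my own names, not from the paper):

```python
from itertools import product

def jump(x, m):
    """Jump_m: OneMax with a fitness valley of width m before the optimum."""
    n, ones = len(x), sum(x)
    return m + ones if ones <= n - m or ones == n else n - ones

def gap(x, f):
    """gap(x): minimum Hamming distance from x to a strictly better point."""
    fx = f(x)
    better = (y for y in product((0, 1), repeat=len(x)) if f(y) > fx)
    return min((sum(a != b for a, b in zip(x, y)) for y in better), default=None)

n, m = 8, 3
local_opt = (1,) * (n - m) + (0,) * m  # a local optimum of Jump_m
print(gap(local_opt, lambda y: jump(y, m)))  # -> 3 (= m)
```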

TODO...

The Poster

[Poster image]