
Stagnation Detection with Randomized Local Search

By Amirhossein Rajabi, Carsten Witt, in Proc. of EvoCOP 2021.

  • DOI
  • arXiv
Abstract

The Paper Story

Self-adjusting Mechanisms

Self-adjusting evolutionary algorithms can learn good or even near-optimal parameter settings.

  • Examples of popular self-adjusting EAs:
👉

$(1+(\lambda,\lambda))$ GA with 1/5-rule: if $f(y) > f(x)$ then $\lambda \leftarrow \max\{\lambda/F,\, 1\}$; if $f(y) \le f(x)$ then $\lambda \leftarrow \min\{\lambda F^{1/4},\, n\}$ (for an update factor $F > 1$).

👉

$(1+\lambda)$ EA with two rates: create $\lambda/2$ offspring by flipping each bit with prob. $r/(2n)$ and $\lambda/2$ offspring by flipping each bit with prob. $2r/n$. With prob. $1/2$, replace $r$ with the strength that the best offspring has been created with (but with probability $1/2$ do a random step, i.e., halve or double $r$ uniformly at random). A sketch of this scheme is given below.
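As an illustration of the second scheme, here is a minimal Python sketch, assuming OneMax as the fitness function and a clamping range for $r$ as in the literature; the function names and parameter defaults are illustrative assumptions, not taken from this post or the paper.

```python
import random

def onemax(x):
    """Illustrative fitness (an assumption): number of one-bits."""
    return sum(x)

def mutate(x, p):
    """Standard bit-wise mutation: flip each bit independently with prob. p."""
    return [1 - b if random.random() < p else b for b in x]

def two_rate_ea(n, lam, max_evals, f=onemax):
    """Sketch of the self-adjusting (1+lambda) EA with two rates:
    half of the offspring use rate r/(2n), the other half 2r/n, and
    the strength r follows the rate of the best offspring with prob.
    1/2 (otherwise it takes a random step)."""
    x = [random.randint(0, 1) for _ in range(n)]
    r, evals = 2, 0
    while evals < max_evals:
        offspring = []
        for i in range(lam):
            low = i < lam // 2                    # first half: low rate
            y = mutate(x, r / (2 * n) if low else 2 * r / n)
            offspring.append((f(y), low, y))
            evals += 1
        best_f, best_low, best_y = max(offspring, key=lambda t: t[0])
        if best_f >= f(x):                        # elitist (1+lambda) selection
            x = best_y
        if random.random() < 0.5:                 # inherit the winning strength
            r = r / 2 if best_low else 2 * r
        else:                                     # random step instead
            r = random.choice([r / 2, 2 * r])
        r = max(2, min(r, n / 4))                 # clamp r (range as in the literature)
    return x

# Illustrative usage (parameters are arbitrary):
# best = two_rate_ea(n=100, lam=10, max_evals=50_000)
```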

These self-adjusting schemes work because they, in some sense, measure how successful different parameter choices are relative to each other. In a local optimum, however, there is essentially no signal for the self-adjusting scheme: classical self-adjusting algorithms take many unsuccessful steps in such situations and end up setting the mutation rate to its minimum, although that might not be the best choice for leaving the local optimum. On multimodal problems, increasing the rate is often beneficial instead, so existing self-adjusting EAs perform poorly in this situation.

⚠️

Most self-adjusting EAs perform poorly in local optima.

👉

Stagnation Detection is an efficient remedy proposed at GECCO 2020 (story). Stagnation Detection combined with RLS can be even more efficient; a sketch of the resulting mechanism is given below.
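A minimal Python sketch of the idea, assuming the mechanism described in the paper: flip exactly $s$ bits chosen uniformly at random, and once roughly $\binom{n}{s}\ln R$ steps at strength $s$ have failed, conclude that an improvement at Hamming distance $s$ is unlikely to have been overlooked and increase $s$. The function names, the default choice of $R$, and the handling of the upper limit on $s$ are illustrative simplifications, not the paper's exact pseudocode.

```python
import math
import random

def flip_s_bits(x, s):
    """Flip exactly s distinct bits chosen uniformly at random."""
    y = x[:]
    for i in random.sample(range(len(x)), s):
        y[i] = 1 - y[i]
    return y

def sd_rls(n, max_evals, f, R=None):
    """Sketch of RLS with stagnation detection: on a strict improvement
    the strength s resets to 1; after about C(n, s) * ln(R) unsuccessful
    steps at strength s, stagnation is detected and s is increased."""
    R = R if R is not None else n        # illustrative default for R
    x = [random.randint(0, 1) for _ in range(n)]
    s, u, evals = 1, 0, 0
    while evals < max_evals:
        y = flip_s_bits(x, s)
        evals += 1
        if f(y) > f(x):                  # success: restart at strength 1
            x, s, u = y, 1, 0
        else:
            u += 1
            if u > math.comb(n, s) * math.log(R):
                s, u = min(s + 1, n), 0  # stagnation detected: raise s
    return x
```

The threshold $\binom{n}{s}\ln R$ is chosen so that an existing improvement at distance $s$ is missed only with probability roughly $1/R$, which is what justifies moving on to larger strengths.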

Multimodal Functions

➕

Definition of "gap": Given a fitness function $f\colon \{0,1\}^n \to \mathbb{R}$, the gap of a search point $x$ is its minimum Hamming distance to a strictly better point: $\mathrm{gap}(x) = \min\{H(x, y) \mid f(y) > f(x)\}$.

  • Multimodal functions have at least one non-optimal point $x$ such that $\mathrm{gap}(x) > 1$.

Example: In the multimodal function $\mathrm{Jump}_m$, $\mathrm{gap}(x) = m$ for all points $x$ such that $\|x\|_1 = n - m$.
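To make the definition concrete, here is a small brute-force sketch in Python, assuming the standard $\mathrm{Jump}_m$ definition; it enumerates all $2^n$ points, so it is meant only to illustrate the notion of gap on tiny instances.

```python
from itertools import product

def jump(x, m):
    """The Jump_m benchmark: OneMax-like, but points with more than
    n - m (and fewer than n) one-bits form a fitness valley."""
    n, ones = len(x), sum(x)
    if ones <= n - m or ones == n:
        return m + ones
    return n - ones

def gap(x, f):
    """Brute-force gap(x): minimum Hamming distance from x to a
    strictly better point (exponential in n; illustration only)."""
    fx, n = f(x), len(x)
    return min(
        (sum(a != b for a, b in zip(x, y))
         for y in product((0, 1), repeat=n) if f(list(y)) > fx),
        default=0,  # x is globally optimal: no strictly better point
    )

x = [1, 1, 1, 0, 0, 0]               # a local optimum of Jump_3, n = 6
print(gap(x, lambda z: jump(z, 3)))  # prints 3, i.e. gap(x) = m
```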

TODO...

The Poster

[Poster image]