OneMax is not the easiest function for fitness improvements
DOI: 10.1007/978-3-031-30035-6_11 · arXiv: 2204.07017 · OpenAlex: W4361798389 · MaRDI QID: Q6149099
Maxime Larcher, Johannes Lengler, Xun Zou, Marc Kaufmann
Publication date: 12 January 2024
Published in: Evolutionary Computation in Combinatorial Optimization
Full work available at URL: https://arxiv.org/abs/2204.07017
Keywords: evolutionary algorithm; self-adaptation; dynamic environments; parameter control; OneMax; one-fifth rule; \((1,\lambda)\)-EA
Evolutionary algorithms, genetic algorithms (computational aspects) (68W50); Approximation methods and heuristics in mathematical programming (90C59); Combinatorial optimization (90C27)
Related Items (1)
Cites Work
- Concentration of first hitting times under additive drift
- From black-box complexity to designing new genetic algorithms
- A rigorous analysis of the compact genetic algorithm for linear functions
- Learning probability distributions in continuous evolutionary algorithms -- a comparative review
- Multiplicative drift analysis
- Optimal parameter choices via precise black-box analysis
- Runtime analysis of the \((\mu + 1)\)-EA on the dynamic BinVal function
- Self-adjusting mutation rates with provably optimal success rules
- On the runtime analysis of the simple genetic algorithm
- The choice of the offspring population size in the \((1,\lambda)\) evolutionary algorithm
- On easiest functions for mutation operators in bio-inspired optimisation
- Runtime analysis for self-adaptive mutation rates
- Tight Bounds on the Optimization Time of a Randomized Search Heuristic on Linear Functions
- Self-adjusting offspring population sizes outperform fixed parameters on the cliff function
- Theory of Evolutionary Computation
- On the Brittleness of Evolutionary Algorithms
- Self-adjusting population sizes for the \((1,\lambda)\)-EA on monotone functions