Tuning Algorithms for Stochastic Black-Box Optimization: State of the Art and Future Perspectives
Publication:5153496
DOI: 10.1007/978-3-030-66515-9_3
OpenAlex: W3167513112
MaRDI QID: Q5153496
Frederik Rehbach, Margarita Rebolledo, Thomas Bartz-Beielstein
Publication date: 30 September 2021
Published in: Black Box Optimization, Machine Learning, and No-Free Lunch Theorems
Full work available at URL: https://doi.org/10.1007/978-3-030-66515-9_3
- Learning and adaptive systems in artificial intelligence (68T05)
- Approximation methods and heuristics in mathematical programming (90C59)
Cites Work
- Optimization of algorithms with OPAL
- Evolutionary self-adaptation: a survey of operators and strategy parameters
- Parameter setting in evolutionary algorithms
- Efficient global optimization of expensive black-box functions
- Handbook of test problems in local and global optimization
- The design and analysis of computer experiments
- Test problem generator by neural network for algorithms that try solving nonlinear programming problems globally
- Design and analysis of computer experiments. With comments and a rejoinder by the authors
- Best practices for comparing optimization algorithms
- Stacked regressions
- Designing and reporting on computational experiments with heuristic methods
- PAVER 2.0: an open source environment for automated performance analysis of benchmarking data
- No free lunch theorem: a review
- The optimization test environment
- A new class of test functions for global optimization
- Experimental research in evolutionary computation. The new experimentalism
- Global optimization of stochastic black-box systems via sequential kriging meta-models
- Experimental Methods for the Analysis of Optimization Algorithms
- The Future of Experimental Research
- Mixed Models for the Analysis of Optimization Algorithms
- Tuning an Algorithm Using Design of Experiments
- The Sequential Parameter Optimization Toolbox
- Design and Analysis of Optimization Algorithms Using Computational Statistics
- Randomly Generated Test Problems for Positive Definite Quadratic Programming
- Optimization by Simulated Annealing: An Experimental Evaluation; Part I, Graph Partitioning
- Multi-fidelity optimization via surrogate modelling
- Operational Reasoning for Concurrent Caml Programs and Weak Memory Models
- ParamILS: An Automatic Algorithm Configuration Framework
- Testing Unconstrained Optimization Software
- Optimization by Simulated Annealing: An Experimental Evaluation; Part II, Graph Coloring and Number Partitioning
- Feature Article—Reporting Computational Experiments with Parallel Algorithms: Issues, Measures, and Experts' Opinions
- CUTE
- Algorithm 709: testing algorithm implementations
- Sensitivity Analysis in Practice
- Feature Article—Toward an Experimental Method for Algorithm Simulation
- Benchmarking Derivative-Free Optimization Algorithms
- A Note on Performance Profiles for Benchmarking Software
- A Comparison of Several Current Optimization Methods, and the use of Transformations in Constrained Problems
- A Simplex Method for Function Minimization
- Finding Optimal Algorithmic Parameters Using Derivative‐Free Optimization
- Design and analysis of simulation experiments
- Introduction to evolutionary computing
- Using experimental design to find effective parameter settings for heuristics
- The elements of statistical learning. Data mining, inference, and prediction
- Experimental evaluation of heuristic optimization algorithms: A tutorial
- Benchmarking optimization software with performance profiles