Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization
DOI: 10.48550/arXiv.1603.06560 | zbMath: 1468.68204 | arXiv: 1603.06560 | MaRDI QID: Q72746
Giulia DeSalvo, Kevin Jamieson, Lisha Li, Afshin Rostamizadeh, Ameet Talwalkar
Publication date: 21 March 2016
Full work available at URL: https://arxiv.org/abs/1603.06560
Artificial neural networks and deep learning (68T07) Learning and adaptive systems in artificial intelligence (68T05) Approximation methods and heuristics in mathematical programming (90C59) Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.) (68T20) Online algorithms; streaming algorithms (68W27)
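For orientation, the procedure named in the title can be summarized in a short sketch. The following is a minimal, illustrative Python rendering of the Hyperband loop described in the linked arXiv paper: brackets of successive halving indexed by s, trading off the number of sampled configurations against the resource given to each, for a maximum resource R and halving rate eta. The helper names `sample_config` and `run_then_return_val_loss`, the defaults R=81 and eta=3, and the toy objective in the usage example are placeholders for illustration, not part of this record.

```python
import math
import random


def hyperband(sample_config, run_then_return_val_loss, R=81, eta=3):
    """Sketch of the Hyperband loop: each bracket s runs successive halving,
    trading off the number of sampled configurations n against the minimum
    resource r allotted to each of them."""
    # Largest integer s_max with eta**s_max <= R, i.e. floor(log_eta(R)).
    s_max = 0
    while eta ** (s_max + 1) <= R:
        s_max += 1
    B = (s_max + 1) * R                      # approximate budget spent per bracket
    best_loss, best_config = float("inf"), None

    for s in range(s_max, -1, -1):           # aggressive -> conservative brackets
        n = int(math.ceil(B / R * eta ** s / (s + 1)))  # configurations sampled
        r = R * eta ** (-s)                  # minimum resource per configuration
        T = [sample_config() for _ in range(n)]

        for i in range(s + 1):               # rounds of successive halving
            n_i = n // eta ** i              # configurations entering this round
            r_i = r * eta ** i               # resource given to each of them
            losses = [run_then_return_val_loss(t, r_i) for t in T]
            ranked = sorted(zip(losses, T), key=lambda pair: pair[0])
            if ranked and ranked[0][0] < best_loss:
                best_loss, best_config = ranked[0]
            T = [t for _, t in ranked[: n_i // eta]]  # keep the top 1/eta fraction

    return best_config, best_loss


if __name__ == "__main__":
    # Toy usage with a hypothetical one-dimensional objective: minimize
    # (x - 0.3)**2, where more "resource" simply means less evaluation noise.
    cfg, loss = hyperband(
        sample_config=lambda: {"x": random.uniform(0.0, 1.0)},
        run_then_return_val_loss=lambda c, r: (c["x"] - 0.3) ** 2
        + random.gauss(0.0, 0.1 / r),
    )
    print(cfg, loss)
```

With R=81 and eta=3 this sketch reproduces the bracket schedule reported in the paper for those settings (e.g. the most aggressive bracket starts 81 configurations at resource 1, while the most conservative runs 5 configurations at the full resource 81).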
Cites Work
- Fast Bayesian hyperparameter optimization on large datasets
- Bayesian Optimization in a Billion Dimensions via Random Embeddings
- Efficient Multi-Start Strategies for Local Search Algorithms
- Pure Exploration in Multi-armed Bandits Problems
- Information-Theoretic Regret Bounds for Gaussian Process Optimization in the Bandit Setting