FlexiBO: A Decoupled Cost-Aware Multi-Objective Optimization Approach for Deep Neural Networks
From MaRDI portal
Publication: 6333096
DOI: 10.1613/jair.1.14139
arXiv: 2001.06588
OpenAlex: W4382792970
MaRDI QID: Q6333096
Pooyan Jamshidi, Md Shahriar Iqbal, Jianhai Su, Lars Kotthoff
Publication date: 17 January 2020
Full work available at URL: https://doi.org/10.1613/jair.1.14139
Cites Work
- Multiple-gradient descent algorithm (MGDA) for multiobjective optimization
- Stochastic method for the solution of unconstrained vector optimization problems
- Efficient global optimization of expensive black-box functions
- Descent algorithm for nonsmooth stochastic multiobjective optimization
- Interactive Thompson sampling for multi-objective multi-armed bandits
- On using the hypervolume indicator to compare Pareto fronts: applications to multi-criteria optimal experimental design
- A Survey of Multi-Objective Sequential Decision-Making
- On Finding the Maxima of a Set of Vectors
- Output Space Entropy Search Framework for Multi-Objective Bayesian Optimization
- Information-Theoretic Regret Bounds for Gaussian Process Optimization in the Bandit Setting
- Advanced Lectures on Machine Learning
- Multiobjective optimization using Gaussian process emulators via stepwise uncertainty reduction