Goal-oriented sensitivity analysis of hyperparameters in deep learning
From MaRDI portal
Publication: 6159254
DOI: 10.1007/s10915-022-02083-4
arXiv: 2207.06216
OpenAlex: W4200629985
MaRDI QID: Q6159254
David Lugato, Gaël Poëtte, Pietro Marco Congedo, Paul Novello
Publication date: 20 June 2023
Published in: Journal of Scientific Computing
Full work available at URL: https://arxiv.org/abs/2207.06216
Keywords: sensitivity analysis; interpretability; hyperparameter optimization; scientific machine learning; Hilbert-Schmidt independence criterion
Related Items (2)
- Deterministic neural networks optimization from a continuous and energy point of view
- Accelerating hypersonic reentry simulations using deep learning-based hybridization (with guarantees)
Cites Work
- Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization
- Making best use of model evaluations to compute sensitivity indices
- Multidimensional scaling by optimizing goodness of fit to a nonmetric hypothesis
- New sensitivity analysis subordinated to a contrast
- Integral Probability Metrics and Their Generating Classes of Functions
- Global sensitivity analysis with dependence measures
- Global Sensitivity Analysis for Optimization with Variable Selection
- Algorithmic Learning Theory
This page was built for publication: Goal-oriented sensitivity analysis of hyperparameters in deep learning