scientific article; zbMATH DE number 7370593
Publication:4998979
Publication date: 9 July 2021
Full work available at URL: https://arxiv.org/abs/2004.08436
Title: not available (zbMATH Open Web Interface contents unavailable due to conflicting licenses)
Keywords: discrepancy principle; reproducing kernel Hilbert space; oracle inequality; effective dimension; early stopping; non-parametric regression; spectral regularization
Related Items (3)
- A note on the prediction error of principal component regression in high dimensions
- Towards adaptivity via a new discrepancy principle for Poisson inverse problems
- From inexact optimization to learning via gradient concentration
Cites Work
- Regularization theory for ill-posed problems. Selected topics
- Kernel ridge vs. principal component regression: minimax bounds and the qualification of regularization operators
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Optimal rates for regularization of statistical inverse learning problems
- Variance estimation for high-dimensional regression models
- On regularization algorithms in learning theory
- Non-asymptotic adaptive prediction in functional linear models
- Variance function estimation in multivariate nonparametric regression with fixed design
- Early stopping for statistical inverse problems via truncated SVD estimation
- A nearest neighbor estimate of the residual variance
- A linear functional strategy for regularized ranking
- A distribution-free theory of nonparametric regression
- Weak convergence and empirical processes. With applications to statistics
- Adaptive kernel methods using the balancing principle
- Smoothed residual stopping for statistical inverse problems via truncated SVD estimation
- Residual variance estimation using a nearest neighbor statistic
- Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
- High-probability bounds for the reconstruction error of PCA
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Optimal rates for the regularized least-squares algorithm
- On some extensions of Bernstein's inequality for self-adjoint operators
- The discretized discrepancy principle under general source conditions
- Shannon sampling. II: Connections to learning theory
- Local Rademacher complexities
- Boosting with early stopping: convergence and consistency
- Learning theory estimates via integral operators and their approximations
- On early stopping in gradient descent learning
- On the mathematical foundations of learning
- Convergence rates of kernel conjugate gradient for random design regression
- Discrepancy principle for statistical inverse problems with application to conjugate gradient iteration
- Mathematical Foundations of Infinite-Dimensional Statistical Models
- Support Vector Machines
- Cross-validation based adaptation for regularization operators in learning theory
- A Technique for the Numerical Solution of Certain Integral Equations of the First Kind
- Boosting with the L2 loss
- High-Dimensional Probability
- Optimal Adaptation for Early Stopping in Statistical Inverse Problems
- Neural Network Learning
- Early Stopping for Kernel Boosting Algorithms: A General Analysis With Localized Complexities
- An Introduction to Matrix Concentration Inequalities
- Potential Functions in Mathematical Pattern Recognition
- Theory of Reproducing Kernels
- Introduction to nonparametric estimation
- Boosting methods for regression