Local convergence rates of the nonparametric least squares estimator with applications to transfer learning
From MaRDI portal
Publication: 6565304
DOI: 10.3150/23-bej1655
MaRDI QID: Q6565304
Johannes Schmidt-Hieber, Petr Zamolodtchikov
Publication date: 2 July 2024
Published in: Bernoulli
Keywords: mean squared error, nonparametric regression, minimax estimation, nonparametric least squares, transfer learning, domain adaptation, covariate shift, local rates
Cites Work
- Maximum likelihood estimation of a log-concave density and its distribution function: basic properties and uniform consistency
- Estimating a regression function
- Optimal rates of convergence for nonparametric estimators
- The asymptotic behavior of monotone regression estimates
- Rates of convergence for minimum contrast estimators
- Locally adaptive regression splines
- A Bayesian/information theoretic model of learning to learn via multiple task sampling
- Improving predictive inference under covariate shift by weighting the log-likelihood function
- Singular measures and the key of \(G\)
- A regularity class for the roots of nonnegative functions
- Nonparametric shape-restricted regression
- Regularization and the small-ball method. I: Sparse recovery
- A distribution-free theory of nonparametric regression
- Estimation of a convex function: characterizations and asymptotic theory
- Weak convergence and empirical processes. With applications to statistics
- On the rate of convergence of fully connected deep neural network regression estimates
- Set structured global empirical risk minimizers are rate optimal in general dimensions
- Marginal singularity and the benefits of labels in covariate-shift
- Adaptive transfer learning
- On the robustness of minimum norm interpolators and regularized empirical risk minimizers
- On least squares estimation under heteroscedastic and heavy-tailed errors
- Nonparametric regression using deep neural networks with ReLU activation function
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- Convergence rates of least squares regression estimators with heavy-tailed errors
- Distribution-free properties of isotonic regression
- Isotonic regression in general dimensions
- Three notes on perfect linear sets
- On consistency in monotonic regression
- Transfer learning for nonparametric classification: minimax rate and adaptive classifier
- Maximum Likelihood Estimates of Monotone Parameters
- On the Estimation of Parameters Restricted by Inequalities
- Bounding the Smallest Singular Value of a Random Matrix Without Concentration
- High-Dimensional Statistics
- Regularization and the small-ball method II: complexity dependent error rates
- Learning Theory and Kernel Machines
- Smoothing Lipschitz functions
- Introduction to nonparametric estimation
- Adaptation to lowest density regions with application to support recovery