A sieve stochastic gradient descent estimator for online nonparametric regression in Sobolev ellipsoids
DOI: 10.1214/22-AOS2212
MaRDI QID: Q2105198
Authors: Tian-Yu Zhang, Noah Robin Simon
Publication date: 8 December 2022
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2104.00846
Related Items (2)
Regression in Tensor Product Spaces by the Method of Sieves ⋮ A sieve stochastic gradient descent estimator for online nonparametric regression in Sobolev ellipsoids
Cites Work
- Nonparametric stochastic approximation with large step-sizes
- Statistical and computational trade-offs in estimation of sparse principal components
- Mercer's theorem on general domains: on the interaction between measures, kernels, and RKHSs
- Minimax optimal rates of estimation in high dimensional additive models
- Mercer theorem for RKHS on noncompact sets
- A reproducing kernel Hilbert space approach to functional linear regression
- Minimax estimation in sparse canonical correlation analysis
- Tractability of multivariate problems. Volume I: Linear information
- Online gradient descent learning algorithms
- Additive regression and other nonparametric models
- Optimal rates of convergence for nonparametric estimators
- Sparse CCA: adaptive estimation and computational barriers
- A distribution-free theory of nonparametric regression
- A sieve stochastic gradient descent estimator for online nonparametric regression in Sobolev ellipsoids
- Just interpolate: kernel "ridgeless" regression can generalize
- Computational barriers in minimax submatrix detection
- Optimal rates for the regularized least-squares algorithm
- Computational and statistical boundaries for submatrix localization in a large noisy matrix
- On the mathematical foundations of learning
- On Martingale Extensions of Vapnik–Chervonenkis Theory with Applications to Online Learning
- Kernel-based Approximation Methods using MATLAB
- Online Learning as Stochastic Approximation of Regularization Paths: Optimality and Almost-Sure Convergence
- Curve fitting by polynomial-trigonometric regression
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Support Vector Machines
- High-Dimensional Statistics
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Theory for penalised spline regression
- Computational Complexity
- An Online Projection Estimator for Nonparametric Regression in Reproducing Kernel Hilbert Spaces