Convergence bounds for empirical nonlinear least-squares
From MaRDI portal
Publication:5034774
DOI: 10.1051/m2an/2021070
zbMath: 1482.62071
arXiv: 2001.00639
OpenAlex: W4206540115
MaRDI QID: Q5034774
Martin Eigel, Reinhold Schneider, Philipp Trunschke
Publication date: 21 February 2022
Published in: ESAIM: Mathematical Modelling and Numerical Analysis
Full work available at URL: https://arxiv.org/abs/2001.00639
Mathematics Subject Classification:
- General nonlinear regression (62J02)
- Abstract approximation theory (approximation in normed linear spaces and other abstract spaces) (41A65)
- Rate of convergence, degree of approximation (41A25)
- Approximation by other special function classes (41A30)
Related Items
- Adaptive Nonintrusive Reconstruction of Solutions to High-Dimensional Parametric PDEs
- Approximative Policy Iteration for Exit Time Feedback Control Problems Driven by Stochastic Differential Equations using Tensor Train Format
Uses Software
Cites Work
- Shearlet approximation of functions with discontinuous derivatives
- On tensor completion via nuclear norm minimization
- Analysis of discrete \(L^2\) projection on polynomial spaces with random evaluations
- User-friendly tail bounds for sums of random matrices
- Convergence estimates in probability and in expectation for discrete least squares with noisy evaluations at random points
- Interpolation via weighted \(\ell_{1}\) minimization
- Infinite-dimensional compressed sensing and function interpolation
- On the convergence rate of sparse grid least squares regression
- Stable recovery of low-dimensional cones in Hilbert spaces: one RIP to rule them all
- A distribution-free theory of nonparametric regression
- Numerical solution of the parametric diffusion equation by deep neural networks
- Variational Monte Carlo -- bridging concepts of machine learning and high-dimensional partial differential equations
- Compressive Hermite interpolation: sparse, high-dimensional approximation from gradient-augmented measurements
- Stable ALS approximation in the TT-format for rank-adaptive tensor completion
- Low rank tensor recovery via iterative hard thresholding
- Iterative methods based on soft thresholding of hierarchical tensors
- An introduction to hierarchical (\(\mathcal H\)-) rank and TT-rank of tensors with examples
- Tail bounds via generic chaining
- On the mathematical foundations of learning
- Breaking the coherence barrier: a new theory for compressed sensing
- On the Minimax Risk of Dictionary Learning
- Tensor Spaces and Numerical Tensor Calculus
- Learning Theory
- Necessary and Sufficient Conditions for the Uniform Convergence of Means to their Expectations
- New tight frames of curvelets and optimal representations of objects with piecewise \(C^2\) singularities
- Optimal weighted least-squares methods
- Analysis of the Generalization Error: Empirical Risk Minimization over Deep Artificial Neural Networks Overcomes the Curse of Dimensionality in the Numerical Approximation of Black--Scholes Partial Differential Equations
- Discrete least squares polynomial approximation with random evaluations -- application to parametric and stochastic elliptic PDEs
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Reproducing kernels of Sobolev spaces on \(\mathbb{R}^d\) and applications to embedding constants and tractability
- Stable signal recovery from incomplete and inaccurate measurements