Sample complexity bounds for the local convergence of least squares approximation
From MaRDI portal
Publication: 6649924
DOI: 10.1142/s0219530524500271
MaRDI QID: Q6649924
Publication date: 6 December 2024
Published in: Analysis and Applications (Singapore)
Mathematics Subject Classification:
- Analysis of algorithms and problem complexity (68Q25)
- Monte Carlo methods (65C05)
- Numerical optimization and variational techniques (65K10)
- Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10)
- Algorithms for approximation of functions (65D15)
- Variational problems in infinite-dimensional spaces (58E99)
Cites Work
- On tensor completion via nuclear norm minimization
- Low-rank tensor completion by Riemannian optimization
- Dimensionality reduction with subgaussian matrices: a unified theory
- The solution path of the generalized lasso
- Interpolation via weighted \(\ell_{1}\) minimization
- The expression of a tensor or a polyadic as a sum of products
- On the convergence rate of sparse grid least squares regression
- Stable recovery of low-dimensional cones in Hilbert spaces: one RIP to rule them all
- Variational Monte Carlo -- bridging concepts of machine learning and high-dimensional partial differential equations
- Non-intrusive tensor reconstruction for high-dimensional random PDEs
- Low rank tensor recovery via iterative hard thresholding
- A literature survey of low-rank tensor approximation techniques
- On Statistical Properties of Sets Fulfilling Rolling-Type Conditions
- Tensor Spaces and Numerical Tensor Calculus
- Curvature Measures
- The Distribution of Rademacher Sums
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Geometric and Topological Inference
- Optimal weighted least-squares methods
- Compressive Imaging: Structure, Sampling, Learning
- Convergence bounds for empirical nonlinear least-squares
- A machine learning approach to optimal Tikhonov regularization I: Affine manifolds
- Approximating Optimal Feedback Controllers of Finite Horizon Control Problems Using Hierarchical Tensor Formats
- A Probabilistic and RIPless Theory of Compressed Sensing
- Recipes for Stable Linear Embeddings From Hilbert Spaces to \(\mathbb{R}^m\)
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Compressed Data Separation With Redundant Dictionaries
- Stable signal recovery from incomplete and inaccurate measurements
- Generalization of a Probability Limit Theorem of Cramer
- Weighted \(\ell_p - \ell_1\) minimization methods for block sparse recovery and rank minimization
- Approximative Policy Iteration for Exit Time Feedback Control Problems Driven by Stochastic Differential Equations using Tensor Train Format
- For interpolating kernel machines, minimizing the norm of the ERM solution maximizes stability
- Towards optimal sampling for learning sparse approximation in high dimensions
- Proof of the theory-to-practice gap in deep learning via sampling complexity bounds for neural network approximation spaces
- Approximating the stationary Bellman equation by hierarchical tensor products