Learning and approximation by Gaussians on Riemannian manifolds
Publication: Q960002 (MaRDI QID)
DOI: 10.1007/s10444-007-9049-0
zbMath: 1156.68045
OpenAlex: W2140348539
Wikidata: Q115384796 (Scholia: Q115384796)
Publication date: 16 December 2008
Published in: Advances in Computational Mathematics
Full work available at URL: https://doi.org/10.1007/s10444-007-9049-0
Keywords: learning theory; approximation; Riemannian manifolds; reproducing kernel Hilbert spaces; Gaussian kernels; multi-kernel least square regularization scheme
Related Items
- Bayesian manifold regression
- Geometry on probability spaces
- Multiscale regression on unknown manifolds
- Learning rates of regularized regression on the unit sphere
- Learning gradients on manifolds
- Intrinsic Dimension Adaptive Partitioning for Kernel Methods
- A deep network construction that adapts to intrinsic dimensionality beyond the domain
- Optimal regression rates for SVMs using Gaussian kernels
- Learning sparse gradients for variable selection and dimension reduction
- Learning Rates for Classification with Gaussian Kernels
- Parzen windows for multi-class classification
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Learning Rates of lq Coefficient Regularization Learning with Gaussian Kernel
- Learning rates of multi-kernel regularized regression
- Sampling and Stability
- Rademacher Chaos Complexities for Learning the Kernel Problem
- Semi-supervised learning based on high density region estimation
- A universal envelope for Gaussian processes and their kernels
- Approximating and learning by Lipschitz kernel on the sphere
- SVM LEARNING AND Lp APPROXIMATION BY GAUSSIANS ON RIEMANNIAN MANIFOLDS
- Deep neural networks for rotation-invariance approximation and learning
- Adaptive learning rates for support vector machines working on data with low intrinsic dimension
- High order Parzen windows and randomized sampling
- Minimax-optimal nonparametric regression in high dimensions
Cites Work
- Semi-supervised learning on Riemannian manifolds
- Model selection for regularized least-squares algorithm in learning theory
- Multi-kernel regularized classifiers
- The covering number in learning theory
- Regularization networks and support vector machines
- Fully online classification by regularization
- Consistency of spectral clustering
- Learning theory estimates via integral operators and their approximations
- Error bounds for learning the kernel
- Learning Theory
- Capacity of reproducing kernel spaces in learning theory
- Empirical graph Laplacian approximation of Laplace–Beltrami operators: Large sample results
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data
- Learning Theory
- Theory of Reproducing Kernels