Moduli of smoothness, \(K\)-functionals and Jackson-type inequalities associated with Kernel function approximation in learning theory
From MaRDI portal
Publication: 6587592
DOI: 10.1142/s021953052450009x
zbMATH Open: 1546.41014
MaRDI QID: Q6587592
Publication date: 14 August 2024
Published in: Analysis and Applications (Singapore)
Keywords: learning theory; spherical harmonics; \(K\)-functional; Jackson inequality; modulus of smoothness; reproducing kernel Hilbert space; semigroup operator; kernel function approximation
MSC classifications: Computational learning theory (68Q32); Multipliers for harmonic analysis in several variables (42B15); Rate of convergence, degree of approximation (41A25)
Cites Work
- Sobolev orthogonal polynomials on the unit ball via outward normal derivatives
- Approximation of eigenfunctions in kernel-based spaces
- Reproducing kernel Hilbert spaces associated with kernels on topological spaces
- Weighted Sobolev orthogonal polynomials on the unit ball
- Mercer's theorem on general domains: on the interaction between measures, kernels, and RKHSs
- Probabilistic and average widths of Sobolev spaces on compact two-point homogeneous spaces equipped with a Gaussian measure
- New moduli of smoothness on the unit ball and other domains, introduction and main properties
- On Sobolev orthogonal polynomials
- Complexity of numerical integration over spherical caps in a Sobolev space setting
- Mercer theorem for RKHS on noncompact sets
- Radial kernels and their reproducing kernel Hilbert spaces
- Spherical harmonics and approximations on the unit sphere. An introduction
- The convergence rate of a regularized ranking algorithm
- Numerical integration over spheres of arbitrary dimension
- On regularization algorithms in learning theory
- Multi-kernel regularized classifiers
- Weighted Fourier-Laplace transforms in reproducing kernel Hilbert spaces on the sphere
- Sobolev orthogonal polynomials defined via gradient on the unit ball
- Behavior of a functional in learning theory
- The convergence rate for a \(K\)-functional in learning theory
- Best approximation of functions on the ball on the weighted Sobolev space equipped with a Gaussian measure
- Widths of weighted Sobolev classes on the ball
- A note on application of integral operator in learning theory
- Fractional derivatives and best approximations
- Approximation in Sobolev spaces by kernel expansions
- Approximation numbers of Sobolev and Gevrey type embeddings on the sphere and on the ball -- preasymptotics, asymptotics, and tractability
- Weighted approximation of functions on the unit sphere
- Generalized translation operator and approximation in several variables
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
- Positive definiteness, reproducing kernel Hilbert spaces and beyond
- Optimal learning with anisotropic Gaussian SVMs
- Theory of deep convolutional neural networks. II: Spherical analysis
- Rates of approximation by neural network interpolation operators
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- Characterization of Sobolev spaces on the sphere
- Approximation of functions on the Sobolev space on the sphere in the average case setting
- Gaussian bounds for the weighted heat kernels on the interval, ball, and simplex
- Density problem and approximation error in learning theory
- Application of integral operator for regularized least-square regression
- On approximation by reproducing kernel spaces in weighted \(L^p\) spaces
- Learning rates of least-square regularized regression
- Shannon sampling. II: Connections to learning theory
- New moduli of smoothness on the unit ball, applications and computability
- Learning theory estimates via integral operators and their approximations
- On the mathematical foundations of learning
- Learning rates for the risk of kernel-based quantile regression estimators in additive models
- On Dimension-independent Rates of Convergence for Function Approximation with Gaussian Kernels
- Best Polynomial Approximation on the Unit Sphere and the Unit Ball
- Kernel Approximation on Manifolds I: Bounding the Lebesgue Constant
- Kernel Approximation on Manifolds II: The \(L_\infty\) Norm of the \(L_2\) Projector
- Orthogonal polynomials and partial differential equations on the unit ball
- Learning Theory
- Kernel techniques: From machine learning to meshless methods
- An Explicit Description of the Reproducing Kernel Hilbert Spaces of Gaussian RBF Kernels
- Summability of Fourier orthogonal series for Jacobi weight on a ball in \(\mathbb{R}^d\)
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- Deep distributed convolutional neural networks: Universality
- Fourier series of Jacobi–Sobolev polynomials
- Strong converse inequality for Poisson sums
- Reproducing Properties of Differentiable Mercer-Like Kernels on the Sphere
- Approximation Theory and Harmonic Analysis on Spheres and Balls
- The kernel regularized learning algorithm for solving Laplace equation with Dirichlet boundary
- On the K-functional in learning theory
- Gaussian bounds for the heat kernels on the ball and the simplex: classical approach
- Convergence analysis of distributed multi-penalty regularized pairwise learning
- Analysis of regularized Nyström subsampling for regression functions of low smoothness
- Spectral Approximation on the Unit Ball
- Sobolev Orthogonal Polynomials on a Simplex
- Thresholded spectral algorithms for sparse approximations
- Learning rates for regularized least squares ranking algorithm
- Theory of Reproducing Kernels
- Optimal learning with Gaussians and correntropy loss
- Strong converse inequalities
- Some properties of Gaussian reproducing kernel Hilbert spaces and their implications for function approximation and learning theory
- Neural network interpolation operators of multivariate functions
- On the density of translation networks defined on the unit ball