Distributed regularized least squares with flexible Gaussian kernels
From MaRDI portal
Publication:2036424
DOI: 10.1016/j.acha.2021.03.008
OpenAlex: W3143035105
MaRDI QID: Q2036424
Publication date: 29 June 2021
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://doi.org/10.1016/j.acha.2021.03.008
Keywords: Sobolev space, reproducing kernel Hilbert space, semi-supervised learning, distributed learning, flexible Gaussian kernels
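The title and the cited work "Divide and Conquer Kernel Ridge Regression" point to the divide-and-conquer approach to regularized least squares: partition the sample across machines, solve a local kernel ridge regression on each block, and average the local predictors. The sketch below illustrates that general scheme with a Gaussian kernel; all function names, parameter values, and the toy data are illustrative assumptions, not the algorithm or experiments of the paper itself.

```python
import numpy as np

def gaussian_kernel(A, B, sigma):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def local_krr(X, y, sigma, lam):
    # One machine's estimator: solve (K + n*lam*I) alpha = y on its block.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda Xt: gaussian_kernel(Xt, X, sigma) @ alpha

def distributed_krr(X, y, m, sigma, lam, seed=0):
    # Divide and conquer: split the sample into m disjoint blocks,
    # fit kernel ridge regression locally, average the m predictors.
    rng = np.random.default_rng(seed)
    blocks = np.array_split(rng.permutation(len(X)), m)
    fs = [local_krr(X[b], y[b], sigma, lam) for b in blocks]
    return lambda Xt: sum(f(Xt) for f in fs) / m

# Toy regression problem (hypothetical parameter choices).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (400, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(400)
f = distributed_krr(X, y, m=4, sigma=0.3, lam=1e-3)
Xt = np.linspace(-1, 1, 5)[:, None]
pred = f(Xt)
```

Averaging the block estimators keeps each local solve at cost O((n/m)^3) while, for a suitable number of blocks, retaining the statistical accuracy of the full-sample estimator, which is the phenomenon the minimax-rate analyses in the cited works make precise.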
Related Items (1)
Cites Work
- User-friendly tail bounds for sums of random matrices
- Oracle inequalities for support vector machines that are based on random entropy numbers
- Regularization in kernel learning
- Fast rates for support vector machines using Gaussian kernels
- Distributed kernel-based gradient descent algorithms
- The covering number in learning theory
- Optimal regression rates for SVMs using Gaussian kernels
- Unregularized online algorithms with varying Gaussians
- Distributed kernel gradient descent algorithm for minimum error entropy principle
- Universality of deep convolutional neural networks
- Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
- Optimal rates for the regularized least-squares algorithm
- Divide and Conquer Kernel Ridge Regression: A Distributed Algorithm with Minimax Optimal Rates
- Support Vector Machines
- CROSS-VALIDATION BASED ADAPTATION FOR REGULARIZATION OPERATORS IN LEARNING THEORY
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- Deep distributed convolutional neural networks: Universality
This page was built for publication: Distributed regularized least squares with flexible Gaussian kernels