The performance of semi-supervised Laplacian regularized regression with the least square loss
From MaRDI portal
Publication: 2980112
DOI: 10.1142/S0219691317500163 · zbMath: 1360.41008 · MaRDI QID: Q2980112
Bao Huai Sheng, Dao-Hong Xiang
Publication date: 27 April 2017
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Computational learning theory (68Q32)
Convex programming (90C25)
Learning and adaptive systems in artificial intelligence (68T05)
Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
Rate of convergence, degree of approximation (41A25)
Related Items (5)
The kernel regularized learning algorithm for solving Laplace equation with Dirichlet boundary
Convergence of online pairwise regression learning with quadratic loss
The convergence rate of semi-supervised regression with quadratic loss
Convergence rate of SVM for kernel-based robust regression
Performance analysis of the LapRSSLG algorithm in learning theory
Cites Work
- Convergence rate of the semi-supervised greedy algorithm
- Consistency of regularized spectral clustering
- QMC rules of arbitrary high order: Reproducing kernel Hilbert space approach
- Derivative reproducing properties for kernel methods in learning theory
- A Bernstein-type inequality for \(U\)-statistics and \(U\)-processes
- Semi-supervised learning based on high density region estimation
- Generalization errors of Laplacian regularized least squares regression
- Semi-supervised local Fisher discriminant analysis for dimensionality reduction
- Learning to rank on graphs
- Learning sets with separating kernels
- The convergence rates of Shannon sampling learning algorithms
- Consistency and robustness of kernel-based regression in convex risk minimization
- Learning theory estimates via integral operators and their approximations
- On the mathematical foundations of learning
- Semi-supervised learning using ensembles of multiple 1D-embedding-based label boosting
- Semi-supervised learning using multiple one-dimensional embedding based adaptive interpolation
- A novel semi-supervised learning framework for hyperspectral image classification
- ERROR ANALYSIS FOR THE SPARSE GRAPH-BASED SEMI-SUPERVISED CLASSIFICATION ALGORITHM
- Nonparametric sparsity and regularization
- Learning Theory
- 10.1162/1532443041827925
- Semi-supervised learning for regression based on the diffusion matrix
- Regularization schemes for minimum error entropy principle
- The Learning Rate of lp-coefficient Regularized Shannon Sampling Algorithm
- Probability Inequalities for Sums of Bounded Random Variables
- Theory of Reproducing Kernels
- Convex analysis and monotone operator theory in Hilbert spaces