Optimal learning with Gaussians and correntropy loss
DOI: 10.1142/S0219530519410124 · zbMath: 1462.68159 · OpenAlex: W2989253080 · MaRDI QID: Q5856264
Publication date: 25 March 2021
Published in: Analysis and Applications
Full work available at URL: https://doi.org/10.1142/s0219530519410124
MSC classification:
- General nonlinear regression (62J02)
- Minimax problems in mathematical programming (90C47)
- Learning and adaptive systems in artificial intelligence (68T05)
- Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
- Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
Related Items (14)
Cites Work
- Consistency analysis of an empirical minimum error entropy algorithm
- Quantile regression with \(\ell_1\)-regularization and Gaussian kernels
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Learning rates for kernel-based expectile regression
- Adaptive Bayesian estimation using a Gaussian random field with inverse gamma bandwidth
- Regularization in kernel learning
- Fast rates for support vector machines using Gaussian kernels
- Blind source separation using Rényi's \(\alpha\)-marginal entropies
- Optimal regression rates for SVMs using Gaussian kernels
- Learning with correntropy-induced losses for regression with mixture of symmetric stable noise
- Correntropy as a novel measure for nonlinearity tests
- Optimal rates for the regularized least-squares algorithm
- An RKHS approach to estimate individualized treatment rules based on functional predictors
- Learning rates for the risk of kernel-based quantile regression estimators in additive models
- The MEE Principle in Data Classification: A Perceptron-Based Analysis
- Learning Theory
- Support Vector Machines
- Correntropy: Properties and Applications in Non-Gaussian Signal Processing
- Gradient descent for robust kernel-based regression
- Generalized correlation function: definition, properties, and application to blind equalization
- A Statistical Learning Approach to Modal Regression
- Information Theoretic Learning
- A Regularized Correntropy Framework for Robust Pattern Recognition
- Distributed learning with indefinite kernels
- Regularization schemes for minimum error entropy principle
- Learning rates for regularized least squares ranking algorithm
- Robust Principal Component Analysis Based on Maximum Correntropy Criterion
- Learning Rates for Classification with Gaussian Kernels