Distributed robust regression with correntropy losses and regularization kernel networks
From MaRDI portal
Publication: Q6564882
DOI: 10.1142/s0219530523500355
MaRDI QID: Q6564882
Publication date: 1 July 2024
Published in: Analysis and Applications (Singapore)
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Learning and adaptive systems in artificial intelligence (68T05)
- Sensitivity, stability, parametric optimization (90C31)
- Stochastic programming (90C15)
- Distributed algorithms (68W15)
Cites Work
- Unnamed Item (8 unresolved citation entries in the portal record)
- Maximum correntropy Kalman filter
- Least square regression with indefinite kernels and coefficient regularization
- Families of alpha-, beta- and gamma-divergences: flexible and robust measures of similarities
- Model selection for regularized least-squares algorithm in learning theory
- Robustness of reweighted least squares kernel based regression
- Distributed kernel-based gradient descent algorithms
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Regularization networks with indefinite kernels
- Distributed kernel gradient descent algorithm for minimum error entropy principle
- Universality of deep convolutional neural networks
- Optimal rates for coefficient-based regularized regression
- Optimal rates for the regularized least-squares algorithm
- On some extensions of Bernstein's inequality for self-adjoint operators
- Kernel-based sparse regression with the correntropy-induced loss
- Consistency and robustness of kernel-based regression in convex risk minimization
- Sous-espaces d'espaces vectoriels topologiques et noyaux associés. (Noyaux reproduisants.) [Subspaces of topological vector spaces and associated kernels (reproducing kernels)]
- Averaging versus voting: a comparative study of strategies for distributed classification
- Learning Theory
- Support Vector Machines
- Correntropy: Properties and Applications in Non-Gaussian Signal Processing
- Gradient descent for robust kernel-based regression
- Generalized correlation function: definition, properties, and application to blind equalization
- On the optimality of averaging in distributed statistical learning
- Robust Hyperspectral Unmixing With Correntropy-Based Metric
- Learning Rates of lq Coefficient Regularization Learning with Gaussian Kernel
- Information Theoretic Learning
- Distributed learning with indefinite kernels
- Indefinite Proximity Learning: A Review
- A General Qualitative Definition of Robustness
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- Theory of Reproducing Kernels
- Optimal learning with Gaussians and correntropy loss
- Approximating functions with multi-features by deep convolutional neural networks
- Generalization Analysis of Pairwise Learning for Ranking With Deep Neural Networks
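The publication's central object is the correntropy-induced loss used for robust kernel regression. As a rough illustration only (not taken from the paper), a Welsch-type correntropy loss on a residual r has the form sigma^2 * (1 - exp(-r^2 / (2 sigma^2))): near-quadratic for small residuals, saturating at sigma^2 for large ones, which is what makes it robust to outliers. The function name and parameter below are assumptions for the sketch.

```python
import numpy as np

def correntropy_loss(residual, sigma=1.0):
    """Illustrative correntropy-induced (Welsch-type) loss.

    sigma^2 * (1 - exp(-r^2 / (2 sigma^2))): behaves like r^2 / 2 for
    small residuals and saturates at sigma^2 for large ones, bounding
    the influence of outliers.
    """
    r = np.asarray(residual, dtype=float)
    return sigma**2 * (1.0 - np.exp(-(r**2) / (2.0 * sigma**2)))

# Small residuals: close to the squared loss r^2 / 2.
small = correntropy_loss(0.01)
# Large residuals: loss is capped near sigma^2 = 1, unlike the squared loss.
large = correntropy_loss(100.0)
```

The bounded tail is the robustness mechanism: a single gross outlier contributes at most sigma^2 to the empirical risk, whereas under least squares its contribution grows without bound.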