Learning with correntropy-induced losses for regression with mixture of symmetric stable noise
Publication:2300760
DOI: 10.1016/j.acha.2019.09.001 · zbMath: 1436.62308 · arXiv: 1803.00183 · OpenAlex: W3101423575 · MaRDI QID: Q2300760
Publication date: 28 February 2020
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://arxiv.org/abs/1803.00183
- Computational methods for problems pertaining to statistics (62-08)
- Nonparametric robustness (62G35)
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (9)
- A Statistical Learning Approach to Modal Regression
- Fast rates of minimum error entropy with heavy-tailed noise
- Error analysis on regularized regression based on the maximum correntropy criterion
- Kernel-based maximum correntropy criterion with gradient descent method
- Learning under \((1 + \epsilon)\)-moment conditions
- New Insights Into Learning With Correntropy-Based Regression
- A Framework of Learning Through Empirical Gain Maximization
- Optimal learning with Gaussians and correntropy loss
- Robust wavelet-based estimation for varying coefficient dynamic models under long-dependent structures
Cites Work
- Consistency analysis of an empirical minimum error entropy algorithm
- Maximum correntropy Kalman filter
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Data mining. Concepts and techniques
- Multi-kernel regularized classifiers
- The C-loss function for pattern classification
- Properties of certain symmetric stable distributions
- On the behavior of Tukey's depth and median under symmetric stable distributions
- Concentration estimates for learning with unbounded sampling
- Learning rates of least-square regularized regression
- Learning without Concentration
- Learning Theory
- Support Vector Machines
- Robust techniques for signal processing: A survey
- Finite-memory denoising in impulsive noise using Gaussian mixture models
- Correntropy: Properties and Applications in Non-Gaussian Signal Processing
- Soft-Decision Metrics for Coded Orthogonal Signaling in Symmetric Alpha-Stable Noise
- On-line Bayesian estimation of signals in symmetric α-stable noise
- Generalized correlation function: definition, properties, and application to blind equalization
- Robust Hyperspectral Unmixing With Correntropy-Based Metric
- Generalized Correntropy for Robust Adaptive Filtering
- Outlier Analysis
- A Statistical Learning Approach to Modal Regression
- Information Theoretic Learning
- The stability test for symmetric alpha-stable distributions
- Robust Principal Component Analysis Based on Maximum Correntropy Criterion
- Robust Estimation of a Location Parameter
- Parameter Estimates for Symmetric Stable Distributions
- Stable Distributions in Statistical Inference: 1. Symmetric Stable Distributions Compared to Other Symmetric Long-Tailed Distributions
- Robust Statistics