Nonasymptotic analysis of robust regression with modified Huber's loss
Publication: 2693696
DOI: 10.1016/j.jco.2023.101744
OpenAlex: W4322625819
MaRDI QID: Q2693696
Publication date: 24 March 2023
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2023.101744
Keywords: nonparametric regression; concentration inequality; reproducing kernel Hilbert space; empirical risk minimization; nonasymptotic analysis
MSC: Nonparametric robustness (62G35); General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
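The paper's modified Huber loss and its exact estimator are not reproduced on this page. As a hedged illustration of the topic named in the keywords, the sketch below performs empirical risk minimization in an RKHS with the classical Huber loss (Huber, 1964, cited below) and a Gaussian kernel; the function names, the use of plain gradient descent, and all parameter values are illustrative assumptions, not the paper's method.

```python
import numpy as np

def huber_loss(r, sigma=1.0):
    # Classical Huber loss: quadratic near zero, linear in the tails,
    # so large residuals contribute only linearly to the risk.
    a = np.abs(r)
    return np.where(a <= sigma, 0.5 * r**2, sigma * a - 0.5 * sigma**2)

def kernel_huber_regression(X, y, sigma=1.0, lam=0.1, gamma=1.0,
                            steps=500, lr=0.01):
    # Empirical risk minimization over an RKHS: by the representer
    # theorem, f(x) = sum_i alpha_i k(x_i, x), penalized by
    # lam * ||f||_K^2 = lam * alpha^T K alpha.
    n = len(y)
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-gamma * sq)                        # Gaussian Gram matrix
    alpha = np.zeros(n)
    for _ in range(steps):
        r = K @ alpha - y                          # residuals
        g = np.clip(r, -sigma, sigma)              # Huber loss derivative
        grad = K @ g / n + 2 * lam * (K @ alpha)   # gradient in alpha
        alpha -= lr * grad
    return alpha, K
```

The clipped residual inside the loop is the derivative of the Huber loss; its boundedness is what makes the estimator insensitive to outliers, since a single large residual can contribute at most sigma to the gradient.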
Cites Work
- Kernel ridge vs. principal component regression: minimax bounds and the qualification of regularization operators
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008
- Optimal learning rates for least squares regularized regression with unbounded sampling
- Oracle inequalities for support vector machines that are based on random entropy numbers
- Learning under \((1 + \epsilon)\)-moment conditions
- Model selection for regularized least-squares algorithm in learning theory
- A new perspective on robust \(M\)-estimation: finite sample theory and applications to dependence-adjusted multiple testing
- About the constants in Talagrand's concentration inequalities for empirical processes
- Weak convergence and empirical processes. With applications to statistics
- A statistical learning assessment of Huber regression
- On a regularization of unsupervised domain adaptation in RKHS
- Adaptive Huber regression on Markov-dependent data
- Balancing principle in supervised learning for a general regularization scheme
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Optimal rates for the regularized least-squares algorithm
- Learning rates of least-square regularized regression
- Local Rademacher complexities
- Learning theory estimates via integral operators and their approximations
- Adaptive Huber Regression
- Support Vector Machines
- Are Loss Functions All the Same?
- New Insights Into Learning With Correntropy-Based Regression
- A robust and efficient variable selection method for linear regression
- An Introduction to Artificial Intelligence Based on Reproducing Kernel Hilbert Spaces
- Robust Variable Selection With Exponential Squared Loss
- How general are general source conditions?
- Robust Estimation of a Location Parameter
- Theory of Reproducing Kernels
- Some applications of concentration inequalities to statistics