scientific article; zbMATH DE number 7307489
From MaRDI portal
Publication:5149262
Lecué, Guillaume; Chinot, Geoffrey; Lerasle, Matthieu
Publication date: 8 February 2021
Full work available at URL: https://arxiv.org/abs/1905.04281
Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Keywords: SLOPE; total variation; group Lasso; Lasso; robust learning; Lipschitz and convex loss functions; Rademacher complexity bounds; sparsity bounds
Related Items (2)
- Robust supervised learning with coordinate gradient descent
- Iteratively reweighted \(\ell_1\)-penalized robust regression
Uses Software
Cites Work
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- A new method for estimation and model selection: \(\rho\)-estimation
- Sub-Gaussian mean estimators
- Statistics for high-dimensional data. Methods, theory and applications.
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Robust linear least squares regression
- Stabilité et instabilité du risque minimax pour des variables indépendantes équidistribuées [Stability and instability of the minimax risk for independent identically distributed variables]
- SLOPE-adaptive variable selection via convex optimization
- Random generation of combinatorial structures from a uniform distribution
- The space complexity of approximating the frequency moments
- Smooth discrimination analysis
- Regularization and the small-ball method. I: Sparse recovery
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Optimal aggregation of classifiers in statistical learning.
- Robust low-rank matrix estimation
- Slope meets Lasso: improved oracle bounds and optimality
- ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels
- Simultaneous analysis of Lasso and Dantzig selector
- Gaussian averages of interpolated bodies and applications to approximate reconstruction
- Empirical minimization
- Local Rademacher complexities
- The Group Lasso for Logistic Regression
- Towards the study of least squares estimators with convex penalty
- Atomic Norm Denoising With Applications to Line Spectral Estimation
- Regularization and the small-ball method II: complexity dependent error rates
- Sparsity and Smoothness Via the Fused Lasso
- Tikhonov Regularization and Total Least Squares
- Living on the edge: phase transitions in convex programs with random data
- On Sparsity Inducing Regularization Methods for Machine Learning
- On Multiplier Processes Under Weak Moment Assumptions
- Upper and Lower Bounds for Stochastic Processes
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- An Iterative Regularization Method for Total Variation-Based Image Restoration
- Convexity, Classification, and Risk Bounds
- A fast unified algorithm for solving group-lasso penalized learning problems
- Structured sparsity through convex optimization
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation