The \(L_1\) penalized LAD estimator for high dimensional linear regression
From MaRDI portal
Publication: 391806
DOI: 10.1016/j.jmva.2013.04.001
zbMath: 1279.62144
arXiv: 1202.6347
OpenAlex: W1978901787
MaRDI QID: Q391806
Publication date: 13 January 2014
Published in: Journal of Multivariate Analysis
Full work available at URL: https://arxiv.org/abs/1202.6347
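The estimator this record describes, the \(L_1\)-penalized LAD ("LAD-Lasso"), minimizes \(\sum_i |y_i - x_i^\top\beta| + \lambda\sum_j |\beta_j|\). As a minimal sketch (not code from the paper), both absolute-value terms can be split into positive and negative parts, turning the problem into a linear program; the solver choice (`scipy.optimize.linprog` with HiGHS) and the toy data below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def lad_lasso(X, y, lam):
    """Solve min_beta sum|y - X beta| + lam * ||beta||_1 as a linear program."""
    n, p = X.shape
    # Variables: [beta+, beta-, u, v], with beta = beta+ - beta- and
    # residual y - X beta = u - v, all parts nonnegative.
    c = np.concatenate([lam * np.ones(p), lam * np.ones(p),
                        np.ones(n), np.ones(n)])
    A_eq = np.hstack([X, -X, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * p + 2 * n), method="highs")
    z = res.x
    return z[:p] - z[p:2 * p]

# Toy check: sparse truth, heavy-tailed (Student-t) noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta_true = np.zeros(10)
beta_true[:2] = [3.0, -2.0]
y = X @ beta_true + rng.standard_t(df=2, size=100)
beta_hat = lad_lasso(X, y, lam=1.0)
```

The LP reformulation is the classical route to LAD-type estimators; for large problems, the specialized descent and semismooth Newton methods listed among the related items below are typically faster.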
Related Items
- A smoothing iterative method for quantile regression with nonconvex \(\ell_p\) penalty
- A descent algorithm for constrained LAD-Lasso estimation with applications in portfolio selection
- An efficient semismooth Newton method for adaptive sparse signal recovery problems
- Penalized and constrained LAD estimation in fixed and high dimension
- Robust sparse regression by modeling noise as a mixture of Gaussians
- Quantile regression for single-index-coefficient regression models
- Fast Algorithms for LS and LAD-Collaborative Regression
- High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
- Robust error density estimation in ultrahigh dimensional sparse linear model
- A proximal dual semismooth Newton method for zero-norm penalized quantile regression estimator
- The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data
- Oracle Estimation of a Change Point in High-Dimensional Quantile Regression
- Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
- Adaptive elastic net-penalized quantile regression for variable selection
- Gradient projection Newton pursuit for sparsity constrained optimization
- Low rank matrix recovery with impulsive noise
- Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
- Sure independence screening for analyzing supersaturated designs
- Robust change point detection method via adaptive LAD-Lasso
- Penalised robust estimators for sparse and high-dimensional linear models
- High-dimensional robust regression with \(L_q\)-loss functions
- Adaptive LASSO model selection in a multiphase quantile regression
- The linearized alternating direction method of multipliers for sparse group LAD model
- A descent method for least absolute deviation Lasso problems
- A new active zero set descent algorithm for least absolute deviation with generalized LASSO penalty
- Wild bootstrap inference for penalized quantile regression for longitudinal data
- Sparse quantile regression
- A semi-parametric approach to feature selection in high-dimensional linear regression models
- A null-space-based weighted \(l_1\) minimization approach to compressed sensing
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Asymptotic risk and phase transition of \(l_1\)-penalized robust estimator
- Adaptive robust variable selection
- A Tuning-free Robust and Efficient Approach to High-dimensional Regression
- Adaptive Huber Regression
- Iterative reweighted methods for \(\ell_1-\ell_p\) minimization
- Pivotal estimation via square-root lasso in nonparametric regression
- Double fused Lasso penalized LAD for matrix regression
- A Projection Based Conditional Dependence Measure with Applications to High-dimensional Undirected Graphical Models
- Faster subgradient methods for functions with Hölderian growth
- Scale calibration for high-dimensional robust regression
- High-dimensional robust approximated \(M\)-estimators for mean regression with asymmetric data
- Group penalized quantile regression
- \(\ell_1 - \alpha\ell_2\) minimization methods for signal and image reconstruction with impulsive noise removal
- Robust moderately clipped LASSO for simultaneous outlier detection and variable selection
- Adaptive iterative hard thresholding for least absolute deviation problems with sparsity constraints
- Sparse Solutions of a Class of Constrained Optimization Problems
- Low rank matrix recovery with adversarial sparse noise
Cites Work
- The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators. With comments by Ronald A. Thisted and M. R. Osborne and a rejoinder by the authors
- New volume ratio properties for convex symmetric bodies in \({\mathbb{R}}^ n\)
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Simultaneous analysis of Lasso and Dantzig selector
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Decoding by Linear Programming
- Limit Theorems for Moderate Deviation Probabilities
- Asymptotic Theory of Least Absolute Error Regression
- Shifting Inequality and Recovery of Sparse Signals
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- Asymptotic Analysis of Robust LASSOs in the Presence of Noise With Large Variance
- New Bounds for Restricted Isometry Constants
- Probability Inequalities for Sums of Bounded Random Variables
- Robust Statistics
- Compressed sensing