
The \(L_1\) penalized LAD estimator for high dimensional linear regression

From MaRDI portal
Publication: 391806

DOI: 10.1016/j.jmva.2013.04.001
zbMath: 1279.62144
arXiv: 1202.6347
OpenAlex: W1978901787
MaRDI QID: Q391806

Lie Wang

Publication date: 13 January 2014

Published in: Journal of Multivariate Analysis

Full work available at URL: https://arxiv.org/abs/1202.6347
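For context, the estimator named in the title is the \(L_1\)-penalized least absolute deviations ("LAD-Lasso") estimator, \(\hat{\beta} = \arg\min_\beta \|y - X\beta\|_1 + \lambda\|\beta\\|_1\). The sketch below (not code from the paper; the function name and the penalty parameter `lam` are illustrative) casts this objective as a linear program and solves it with `scipy.optimize.linprog`:

```python
# A minimal sketch of the L1-penalized LAD ("LAD-Lasso") estimator,
#   beta_hat = argmin_beta ||y - X beta||_1 + lam * ||beta||_1,
# solved as a linear program. This is an illustration, not the
# computational method used in the paper.
import numpy as np
from scipy.optimize import linprog

def lad_lasso(X, y, lam):
    """Solve min_beta ||y - X beta||_1 + lam * ||beta||_1 via an LP."""
    n, p = X.shape
    # Decision vector: [beta (p, free), u (n, >= 0), v (p, >= 0)]
    # with |y - X beta| <= u and |beta| <= v componentwise, so the
    # objective becomes sum(u) + lam * sum(v).
    c = np.concatenate([np.zeros(p), np.ones(n), lam * np.ones(p)])
    I_n, I_p = np.eye(n), np.eye(p)
    Z_np, Z_pn = np.zeros((n, p)), np.zeros((p, n))
    A_ub = np.block([
        [ X,   -I_n,  Z_np],   #  X beta - u <= y
        [-X,   -I_n,  Z_np],   # -X beta - u <= -y
        [ I_p,  Z_pn, -I_p],   #  beta - v <= 0
        [-I_p,  Z_pn, -I_p],   # -beta - v <= 0
    ])
    b_ub = np.concatenate([y, -y, np.zeros(p), np.zeros(p)])
    bounds = [(None, None)] * p + [(0, None)] * (n + p)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]
```

Both the loss and the penalty are piecewise linear, which is why the whole problem reduces to a single LP; this also makes the estimator robust to heavy-tailed errors, the setting that connects most of the related items below.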



Related Items

A smoothing iterative method for quantile regression with nonconvex \(\ell_p\) penalty
A descent algorithm for constrained LAD-Lasso estimation with applications in portfolio selection
An efficient semismooth Newton method for adaptive sparse signal recovery problems
Penalized and constrained LAD estimation in fixed and high dimension
Robust sparse regression by modeling noise as a mixture of Gaussians
Quantile regression for single-index-coefficient regression models
Fast Algorithms for LS and LAD-Collaborative Regression
High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
Robust error density estimation in ultrahigh dimensional sparse linear model
A proximal dual semismooth Newton method for zero-norm penalized quantile regression estimator
The robust nearest shrunken centroids classifier for high-dimensional heavy-tailed data
Oracle Estimation of a Change Point in High-Dimensional Quantile Regression
Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
Adaptive elastic net-penalized quantile regression for variable selection
Gradient projection Newton pursuit for sparsity constrained optimization
Low rank matrix recovery with impulsive noise
Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
Sure independence screening for analyzing supersaturated designs
Robust change point detection method via adaptive LAD-Lasso
Penalised robust estimators for sparse and high-dimensional linear models
High-dimensional robust regression with \(L_q\)-loss functions
Adaptive LASSO model selection in a multiphase quantile regression
The linearized alternating direction method of multipliers for sparse group LAD model
A descent method for least absolute deviation Lasso problems
A new active zero set descent algorithm for least absolute deviation with generalized LASSO penalty
Wild bootstrap inference for penalized quantile regression for longitudinal data
Sparse quantile regression
A semi-parametric approach to feature selection in high-dimensional linear regression models
A null-space-based weighted \(\ell_1\) minimization approach to compressed sensing
Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
Asymptotic risk and phase transition of \(l_1\)-penalized robust estimator
Adaptive robust variable selection
A Tuning-free Robust and Efficient Approach to High-dimensional Regression
Adaptive Huber Regression
Iterative reweighted methods for \(\ell_1-\ell_p\) minimization
Pivotal estimation via square-root lasso in nonparametric regression
Double fused Lasso penalized LAD for matrix regression
A Projection Based Conditional Dependence Measure with Applications to High-dimensional Undirected Graphical Models
Faster subgradient methods for functions with Hölderian growth
Scale calibration for high-dimensional robust regression
High-dimensional robust approximated \(M\)-estimators for mean regression with asymmetric data
Group penalized quantile regression
\(\ell_1-\alpha\ell_2\) minimization methods for signal and image reconstruction with impulsive noise removal
Robust moderately clipped LASSO for simultaneous outlier detection and variable selection
Adaptive iterative hard thresholding for least absolute deviation problems with sparsity constraints
Sparse Solutions of a Class of Constrained Optimization Problems
Low rank matrix recovery with adversarial sparse noise



Cites Work