High-dimensional robust regression with \(L_q\)-loss functions
From MaRDI portal
Publication:2674525
DOI: 10.1016/j.csda.2022.107567 · OpenAlex: W4285498631 · Wikidata: Q113877315 · Scholia: Q113877315 · MaRDI QID: Q2674525
Rohana J. Karunamuni, Yi-Bo Wang
Publication date: 14 September 2022
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2022.107567
Cites Work
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Statistics for high-dimensional data. Methods, theory and applications.
- One-step sparse estimates in nonconcave penalized likelihood models
- \(M\)-estimation of linear models with dependent errors
- Lasso-type recovery of sparse representations for high-dimensional data
- A minimax-bias property of the least \(\alpha\)-quantile estimates
- Robust and sparse estimators for linear regression models
- A class of robust and fully efficient regression estimators
- Nonconcave penalized likelihood with a diverging number of parameters.
- Weak convergence and empirical processes. With applications to statistics
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Robust statistical learning with Lipschitz and convex loss functions
- Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
- Simultaneous analysis of Lasso and Dantzig selector
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Adaptive robust variable selection
- Adaptive Huber Regression
- Bounding the Smallest Singular Value of a Random Matrix Without Concentration
- Convex half-quadratic criteria and interacting auxiliary variables for image restoration
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Penalized Composite Quasi-Likelihood for Ultrahigh Dimensional Variable Selection
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Robust Variable Selection With Exponential Squared Loss
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- Robust reduced-rank regression
- Analysis of Half-Quadratic Minimization Methods for Signal and Image Recovery
- Robust Estimation of a Location Parameter
- Robust Statistics
- Quasi-likelihood and/or robust estimation in high dimensions
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers