High-dimensional composite quantile regression: optimal statistical guarantees and fast algorithms
From MaRDI portal
Publication:6184871
DOI: 10.1214/23-ejs2147 | arXiv: 2208.09817 | MaRDI QID: Q6184871
No author found.
Publication date: 5 January 2024
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/2208.09817
Keywords: asymptotic efficiency; high-dimensional data; sparsity; oracle property; composite quantile regression; convolution smoothing
MSC: Ridge regression; shrinkage estimators (Lasso) (62J07) | Foundations and philosophical topics in statistics (62A01)
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Minimum distance Lasso for robust high-dimensional regression
- Statistics for high-dimensional data. Methods, theory and applications.
- Parametric estimation. Finite sample theory
- Composite quantile regression and the oracle model selection theory
- One-step sparse estimates in nonconcave penalized likelihood models
- Parallelizing the dual revised simplex method
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- Weak convergence and empirical processes. With applications to statistics
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- Simultaneous analysis of Lasso and Dantzig selector
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Smoothed quantile regression with large-scale inference
- Optimization with Sparsity-Inducing Penalties
- Reconstruction From Anisotropic Random Measurements
- Adaptive Huber Regression
- Regression Quantiles
- Atomic Decomposition by Basis Pursuit
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- High-Dimensional Statistics
- Local Composite Quantile Regression Smoothing: An Efficient and Safe Alternative to Local Polynomial Regression
- Penalized Composite Quasi-Likelihood for Ultrahigh Dimensional Variable Selection
- High-Dimensional Probability
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Statistical Foundations of Data Science
- Sparse Composite Quantile Regression in Ultrahigh Dimensions With Tuning Parameter Calibration
- Rejoinder to “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
- Robust Variable Selection With Exponential Squared Loss
- Estimation of High Dimensional Mean Regression in the Absence of Symmetry and Light Tail Assumptions
- Robust and consistent variable selection in high-dimensional generalized linear models
- High-Dimensional Quantile Regression: Convolution Smoothing and Concave Regularization
- A general theory of concave regularization for high-dimensional sparse estimation problems