Robust communication-efficient distributed composite quantile regression and variable selection for massive data
From MaRDI portal
Publication: 2242035
DOI: 10.1016/j.csda.2021.107262
OpenAlex: W3156624220
MaRDI QID: Q2242035
Shaomin Li, Benle Zhang, Kang-Ning Wang
Publication date: 9 November 2021
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2021.107262
Related Items (4)
- Communication-efficient distributed estimation of partially linear additive models for large-scale data
- A communication-efficient distributed one-step estimation
- Renewable composite quantile method and algorithm for nonparametric models with streaming data
- Communication-efficient low-dimensional parameter estimation and inference for high-dimensional $L^p$-quantile regression
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Single-index composite quantile regression with heteroscedasticity and general error distributions
- Estimation of linear composite quantile regression using EM algorithm
- Aggregated estimating equation estimation
- Composite quantile regression and the oracle model selection theory
- Distributed testing and estimation under sparse high dimensional models
- Composite quantile regression for correlated data
- Limiting distributions for \(L_1\) regression estimators under general conditions
- Weighted local linear composite quantile estimation for the case of general error distributions
- Robust and efficient estimator for simultaneous model structure identification and variable selection in generalized partial linear varying coefficient models with longitudinal data
- Quantile regression under memory constraint
- Quantile regression in big data: a divide and conquer based strategy
- Distributed inference for quantile regression processes
- Divide and Conquer Kernel Ridge Regression: A Distributed Algorithm with Minimax Optimal Rates
- A split-and-conquer approach for analysis of extraordinarily large data
- A note on automatic variable selection using smooth-threshold estimating equations
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Composite quantile regression for massive datasets
- On the optimality of averaging in distributed statistical learning
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Local Composite Quantile Regression Smoothing: An Efficient and Safe Alternative to Local Polynomial Regression
- Communication-Efficient Distributed Statistical Inference
- Standard errors and covariance matrices for smoothed rank estimators