Partitioned Approach for High-dimensional Confidence Intervals with Large Split Sizes
From MaRDI portal (Publication: 5037796)
DOI: 10.5705/ss.202018.0379 · OpenAlex: W3038058171 · MaRDI QID: Q5037796
Jiarui Zhang, Yang Li, Zemin Zheng, Yao-hua Wu
Publication date: 4 March 2022
Published in: Statistica Sinica
Full work available at URL: https://doi.org/10.5705/ss.202018.0379
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Nearly unbiased variable selection under minimax concave penalty
- A unified approach to model selection and sparse recovery using regularized least squares
- Confidence intervals for high-dimensional inverse covariance estimation
- The Adaptive Lasso and Its Oracle Properties
- Exact post-selection inference, with application to the Lasso
- A partially linear framework for massive heterogeneous data
- Rejoinder: One-step sparse estimates in nonconcave penalized likelihood models
- Distributed testing and estimation under sparse high dimensional models
- Least angle regression. (With discussion)
- A significance test for the lasso
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Divide and Conquer Kernel Ridge Regression: A Distributed Algorithm with Minimax Optimal Rates
- Asymptotic Equivalence of Regularization Methods in Thresholded Parameter Space
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Scaled sparse linear regression
- A split-and-conquer approach for analysis of extraordinarily large data
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Regularization after retention in ultrahigh dimensional linear regression models
- Variance Estimation Using Refitted Cross-Validation in Ultrahigh Dimensional Regression
- Computational Limits of A Distributed Algorithm For Smoothing Spline
- Model Selection for High-Dimensional Quadratic Regression via Regularization
- A Scalable Bootstrap for Massive Data
- Nonparametric Bayesian Aggregation for Massive Data
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Regularization and Variable Selection Via the Elastic Net
- High-Dimensional Variable Selection With Reciprocal L1-Regularization
- Distributed Matrix Completion and Robust Factorization
- High Dimensional Thresholded Regression and Shrinkage Effect
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Split Sample Methods for Constructing Confidence Intervals for Binomial and Poisson Parameters