Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data
Publication: Q2313277
DOI: 10.1214/18-AOS1738 · zbMath: 1421.62044 · arXiv: 1912.01157 · MaRDI QID: Q2313277
Publication date: 18 July 2019
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1912.01157
Keywords: sure screening property; conditional strictly convex loss; goodness-of-fit nonparametric screening; ultrahigh dimensional variable selection
MSC: Applications of statistics to biology and medical sciences; meta analysis (62P10) · Nonparametric statistical resampling methods (62G09)
Related Items (3)
- RaSE: A Variable Screening Framework via Random Subspace Ensembles
- Structure learning via unstructured kernel-based M-estimation
- Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data
Cites Work
- Model-Free Feature Screening for Ultrahigh-Dimensional Data
- Sure independence screening in generalized linear models with NP-dimensionality
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Marginal empirical likelihood and sure independence feature screening
- Robust rank correlation based screening
- Asymptotic normality of maximum quasi-likelihood estimators in generalized linear models with fixed design
- Principled sure independence screening for Cox models with ultra-high-dimensional covariates
- One-step sparse estimates in nonconcave penalized likelihood models
- High-dimensional classification using features annealed independence rules
- High-dimensional additive modeling
- The dimensionality reduction principle for generalized additive models
- A practical guide to splines
- Quasi-likelihood and its application. A general approach to optimal parameter estimation
- A decision-theoretic generalization of on-line learning and an application to boosting
- Adaptive estimation of a quadratic functional by model selection
- Quantile-adaptive model-free variable screening for high-dimensional heterogeneous data
- Nonparametric screening under conditional strictly convex loss for ultrahigh dimensional sparse data
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- The fused Kolmogorov filter: a nonparametric model-free screening method
- Censored rank independence screening for high-dimensional survival data
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- New aspects of Bregman divergence in regression and classification with parametric and nonparametric estimation
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Permutation Tests for Linear Models
- Regularization after retention in ultrahigh dimensional linear regression models
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Feature Screening via Distance Correlation Learning
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Varying Coefficient Models
- Regularization and Variable Selection Via the Elastic Net
- A Road to Classification in High Dimensional Space: The Regularized Optimal Affine Discriminant