Broken adaptive ridge regression and its asymptotic properties
From MaRDI portal
Publication: 1795597
DOI: 10.1016/j.jmva.2018.08.007
zbMath: 1401.62108
DBLP: journals/ma/DaiCSLL18
OpenAlex: W2888331625
Wikidata: Q92615664
Scholia: Q92615664
MaRDI QID: Q1795597
Linlin Dai, Zhenqiu Liu, Kani Chen, Gang Li, Zhi-Hua Sun
Publication date: 16 October 2018
Published in: Journal of Multivariate Analysis
Full work available at URL: http://europepmc.org/articles/pmc6430210
Related Items (8)
- A scalable surrogate \(L_0\) sparse regression method for generalized linear models with applications to large scale data
- Variable selection for case-cohort studies with informatively interval-censored outcomes
- Ridge regression revisited: debiasing, thresholding and bootstrap
- Scalable Algorithms for Large Competing Risks Data
- Weighted least squares model averaging for accelerated failure time models
- A general adaptive ridge regression method for generalized linear models: an iterative re-weighting approach
- Broken adaptive ridge regression for right-censored survival data
- Smoothly adaptively centered ridge estimator
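This record carries no method details, but the broken adaptive ridge (BAR) estimator the title and related items refer to is commonly described in the literature as an iteratively reweighted ridge regression whose fixed point approximates an \(L_0\)-penalized least squares fit. A minimal sketch under that description follows; the function name, tuning values, and thresholds are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def broken_adaptive_ridge(X, y, lam=1.0, xi=1.0, tol=1e-8, max_iter=100):
    """Iteratively reweighted ridge regression (BAR-style sketch).

    Each pass solves a ridge problem whose per-coefficient penalty is
    lam / beta_j^2 from the previous iterate, so small coefficients are
    driven to zero while large ones are penalized only lightly.
    """
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    # Initial ordinary ridge estimate with penalty xi
    beta = np.linalg.solve(XtX + xi * np.eye(p), Xty)
    for _ in range(max_iter):
        # Reweight: coefficients near zero receive a very large penalty
        D = np.diag(lam / np.maximum(beta**2, 1e-12))
        beta_new = np.linalg.solve(XtX + D, Xty)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    # Zero out coefficients that have collapsed numerically
    beta[np.abs(beta) < 1e-6] = 0.0
    return beta
```

On a sparse linear model this iteration typically recovers the support exactly while shrinking the true zeros all the way to zero, which is the oracle-type behavior the paper's asymptotic analysis concerns.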
Uses Software
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Model-Free Feature Screening for Ultrahigh-Dimensional Data
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- On constrained and regularized high-dimensional regression
- Robust rank correlation based screening
- Iterative hard thresholding for compressed sensing
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Estimating the dimension of a model
- Heuristics of instability and stabilization in model selection
- Model free feature screening for ultrahigh dimensional data with responses missing at random
- Adaptive conditional feature screening
- Asymptotics for Lasso-type estimators.
- The risk inflation criterion for multiple regression
- Quantile-adaptive model-free variable screening for high-dimensional heterogeneous data
- Efficient regularized regression with \(L_0\) penalty for variable selection and network construction
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- High-dimensional graphs and variable selection with the Lasso
- Strong oracle optimality of folded concave penalized estimation
- Forward Regression for Ultra-High Dimensional Variable Screening
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- Extended Bayesian information criteria for model selection with large model spaces
- Regularized quantile regression and robust feature screening for single index models
- Iteratively reweighted least squares minimization for sparse recovery
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Recovering Sparse Signals With a Certain Family of Nonconvex Penalties and DC Programming
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Likelihood-Based Selection and Sharp Parameter Estimation
- Feature Selection for Varying Coefficient Models With Ultrahigh-Dimensional Covariates
- The Sparse MLE for Ultrahigh-Dimensional Feature Screening
- Regularization and Variable Selection Via the Elastic Net
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Some Comments on \(C_p\)
- A new look at the statistical model identification