Multi-stage convex relaxation for feature selection
From MaRDI portal
Publication:2435243
DOI: 10.3150/12-BEJ452
zbMath: 1359.62293
arXiv: 1106.0565
OpenAlex: W2963172671
MaRDI QID: Q2435243
Publication date: 4 February 2014
Published in: Bernoulli
Full work available at URL: https://arxiv.org/abs/1106.0565
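For readers skimming this record, the technique named in the title can be illustrated by a small sketch. This is a hedged reconstruction, not the paper's exact algorithm: it implements the general multi-stage convex relaxation idea with a capped-\(\ell_1\) penalty, where each stage solves a weighted Lasso (here by plain coordinate descent) and the weights for the next stage are obtained by thresholding the current coefficients. The function names, the parameters `lam` and `theta`, and the stopping rule are all illustrative choices.

```python
import numpy as np


def weighted_lasso_cd(X, y, weights, n_iter=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + sum_j weights[j] * |b_j|."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    r = y.copy()                       # running residual y - Xb
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]        # partial residual excluding coordinate j
            rho = X[:, j] @ r / n
            # soft-thresholding step with a coordinate-specific weight
            b[j] = np.sign(rho) * max(abs(rho) - weights[j], 0.0) / col_sq[j]
            r -= X[:, j] * b[j]
    return b


def multistage_capped_l1(X, y, lam=0.1, theta=0.5, stages=5):
    """Multi-stage convex relaxation sketch (illustrative):
    stage 1 is the ordinary Lasso; each later stage re-solves a weighted
    Lasso whose penalty is dropped on coordinates whose previous estimate
    exceeded theta (capped-ell_1 relaxation), reducing shrinkage bias."""
    p = X.shape[1]
    w = np.full(p, lam)
    for _ in range(stages):
        b = weighted_lasso_cd(X, y, w)
        w = np.where(np.abs(b) < theta, lam, 0.0)
    return b
```

On a toy sparse regression problem this recovers the true support while the per-stage reweighting removes most of the Lasso's shrinkage on the large coefficients; the values of `lam` and `theta` would normally be tuned, e.g. by cross-validation.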
Mathematics Subject Classification
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Gaining Outlier Resistance With Progressive Quantiles: Fast Algorithms and Theoretical Studies
- Variable selection and parameter estimation with the Atan regularization method
- A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
- A solution approach for cardinality minimization problem based on fractional programming
- Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression
- Separating variables to accelerate non-convex regularized optimization
- Smoothing neural network for \(L_0\) regularized optimization problem with general convex constraints
- Efficient nonconvex sparse group feature selection via continuous and discrete optimization
- Weak Signal Identification and Inference in Penalized Likelihood Models for Categorical Responses
- An extrapolated proximal iteratively reweighted method for nonconvex composite optimization problems
- Calibrating nonconvex penalized regression in ultra-high dimension
- Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
- Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
- Pathwise coordinate optimization for sparse learning: algorithm and theory
- Proximal gradient method with automatic selection of the parameter by automatic differentiation
- Strong oracle optimality of folded concave penalized estimation
- Sparse classification: a scalable discrete optimization perspective
- Unnamed Item
- Towards Statistically Provable Geometric 3D Human Pose Recovery
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Sparsity in penalized empirical risk minimization
- One-step sparse estimates in nonconcave penalized likelihood models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- High-dimensional graphs and variable selection with the Lasso
- Decoding by Linear Programming
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Adaptive Forward-Backward Greedy Algorithm for Learning Sparse Representations