Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
From MaRDI portal
Publication: 466526
DOI: 10.1016/j.jspi.2014.07.006 · zbMath: 1307.62175 · OpenAlex: W2093958718 · MaRDI QID: Q466526
Magali Champion, Christine Cierco-Ayrolles, Sébastien Gadat, Matthieu Vignes
Publication date: 27 October 2014
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2014.07.006
Related Items
Generalized Sobol sensitivity indices for dependent variables: numerical methods ⋮ Boosting as a kernel-based method ⋮ Optimization by Gradient Boosting
Cites Work
- Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices
- Oracle inequalities and optimal inference under group sparsity
- Network inference and biological dynamics
- On performance of greedy algorithms
- Regularized multivariate regression for identifying master predictors with application to integrative genomics study of breast cancer
- Input selection and shrinkage in multiresponse linear regression
- Additive logistic regression: a statistical view of boosting. (With discussion and a rejoinder by the authors)
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Weak greedy algorithms
- Support union recovery in high-dimensional multivariate regression
- Beyond sparsity: recovering structured representations by \({\ell}^1\) minimization and greedy algorithms
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Boosting for high-dimensional linear models
- Compressed Sensing: How Sharp Is the Restricted Isometry Property?
- Decoding by Linear Programming
- Greed is Good: Algorithmic Results for Sparse Approximation
- Jump Diffusion over Feature Space for Object Recognition
- Developments in Linear Regression Methodology: 1959-1982
- Boosting With the \(L_2\) Loss
- Selection bias in gene extraction on the basis of microarray gene-expression data
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Average Case Analysis of Multichannel Sparse Recovery Using Convex Relaxation
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- Regularization and Variable Selection Via the Elastic Net
- Multi-task Regression using Minimal Penalties
- Model Selection and Estimation in Regression with Grouped Variables
- Random forests
- Gene selection for cancer classification using support vector machines