Quasi-likelihood and/or robust estimation in high dimensions
From MaRDI portal
Publication:5965304
DOI: 10.1214/12-STS397
zbMath: 1331.62354
arXiv: 1206.6721
MaRDI QID: Q5965304
Patric Müller, Sara van de Geer
Publication date: 3 March 2016
Published in: Statistical Science
Full work available at URL: https://arxiv.org/abs/1206.6721
Ridge regression; shrinkage estimators (Lasso) (62J07)
Generalized linear models (logistic models) (62J12)
Related Items
- Predictive functional linear models with diverging number of semiparametric single-index interactions
- High-dimensional robust regression with \(L_q\)-loss functions
- Test of significance for high-dimensional longitudinal data
- Finite-sample analysis of \(M\)-estimators using self-concordance
- Adaptive robust variable selection
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Introduction to the special issue on sparsity and regularization methods
- Quasi-likelihood and/or robust estimation in high dimensions
- Penalized robust estimators in sparse logistic regression
- High dimensional generalized linear models for temporal dependent data
Uses Software
Cites Work
- Statistics for high-dimensional data. Methods, theory and applications.
- The Dantzig selector and sparsity oracle inequalities
- \(\ell_{1}\)-penalization for mixture regression models
- Sparsity in penalized empirical risk minimization
- Least squares estimation with complexity penalties
- About the constants in Talagrand's concentration inequalities for empirical processes.
- On the conditions used to prove oracle results for the Lasso
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Robust regression through the Huber's criterion and adaptive lasso penalty
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Aggregation for Gaussian regression
- High-dimensional graphs and variable selection with the Lasso
- Classifiers of support vector machine type with \(\ell_1\) complexity regularization
- Estimation for high-dimensional linear mixed-effects models using \(\ell_1\)-penalization
- Robust principal component analysis?
- Adaptive estimation with soft thresholding penalties
- De-noising by soft-thresholding
- Accuracy guarantees for \(\ell_1\)-recovery
- Aggregation and sparsity via \(\ell_1\)-penalized least squares
- Sparse density estimation with \(\ell_1\) penalties
- The Lasso, correlated design, and improved oracle inequalities
- Quasi-likelihood and/or robust estimation in high dimensions
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers