The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models
From MaRDI portal
Publication:5880769
DOI: 10.1080/02331888.2022.2154769 · OpenAlex: W4311134123 · MaRDI QID: Q5880769
S. C. Pandhare, T. V. Ramanathan
Publication date: 6 March 2023
Published in: Statistics
Full work available at URL: https://doi.org/10.1080/02331888.2022.2154769
Keywords: robust estimation; generalized linear models; focused information criterion; high-dimensional asymptotics; nodewise regression; desparsified lasso
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV -- 2015
- Sparse estimators and the oracle property, or the return of Hodges' estimator
- Valid post-selection inference
- Statistics for high-dimensional data. Methods, theory and applications.
- Von Mises calculus for statistical functionals
- Robust covariance and scatter matrix estimation under Huber's contamination model
- Asymptotics for Lasso-type estimators.
- On the conditions used to prove oracle results for the Lasso
- Statistical consistency and asymptotic normality for high-dimensional robust \(M\)-estimators
- The robust focused information criterion for strong mixing stochastic processes with \(\mathscr{L}^2\)-differentiable parametric densities
- The de-biased group Lasso estimation for varying coefficient models
- Detangling robustness in high dimensions: composite versus model-averaged estimation
- Valid post-selection inference in model-free linear regression
- The focused information criterion for logistic time series regression models under locally biased estimating functions
- Influence functions for penalized M-estimators
- High-dimensional generalized linear models and the lasso
- Focused information criterion and model averaging for generalized additive partial linear models
- Order selection in ARMA models using the focused information criterion
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Adaptive Huber Regression
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Model Selection and Model Averaging
- Prediction-focused model selection for autoregressive models
- The Influence Curve and Its Role in Robust Estimation
- Robust and efficient estimation by minimising a density power divergence
- Robust Inference for Generalized Linear Models
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Frequentist Model Average Estimators
- The Focused Information Criterion
- Model averaging for M-estimation
- A high-dimensional focused information criterion
- Robust Estimation via Robust Gradient Estimation
- Robust estimation via generalized quasi-gradients
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Regularization and Variable Selection Via the Elastic Net
- Robust and consistent variable selection in high-dimensional generalized linear models
- An asymptotic theory for model selection inference in general semiparametric problems
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Robust Estimation of a Location Parameter
- On the Assumptions Used to Prove Asymptotic Normality of Maximum Likelihood Estimates
- Model selection and inference: facts and fiction
- Challenges for econometric model selection
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Robust Statistics
- The focussed information criterion for generalised linear regression models for time series