Adaptive Dantzig density estimation
From MaRDI portal
Publication:629798
DOI: 10.1214/09-AIHP351
zbMath: 1207.62077
arXiv: 0905.0884
MaRDI QID: Q629798
Karine Bertin, Erwan Le Pennec, Vincent Rivoirard
Publication date: 10 March 2011
Published in: Annales de l'Institut Henri Poincaré. Probabilités et Statistiques
Full work available at URL: https://arxiv.org/abs/0905.0884
Keywords: calibration; density estimation; sparsity; concentration inequalities; oracle inequalities; dictionary; Dantzig estimate; lasso estimate
Mathematics Subject Classification: Density estimation (62G07); Asymptotic properties of nonparametric inference (62G20); Nonparametric estimation (62G05)
Related Items (14)
- Penalized logspline density estimation using total variation penalty
- Oracle inequalities for the Lasso in the high-dimensional Aalen multiplicative intensity model
- Compressive Gaussian Mixture Estimation
- Compressive statistical learning with random feature moments
- Lasso-type estimators for semiparametric nonlinear mixed-effects models estimation
- Estimator selection: a new method with applications to kernel density estimation
- Lasso in infinite dimension: application to variable selection in functional multivariate linear regression
- Adaptive estimation in the nonparametric random coefficients binary choice model by needlet thresholding
- High-dimensional additive hazards models and the lasso
- Optimal Kullback-Leibler aggregation in mixture density estimation by maximum likelihood
- Sparse recovery from extreme eigenvalues deviation inequalities
- A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts models
- Lasso and probabilistic inequalities for multivariate point processes
- Adaptive estimation of the baseline hazard function in the Cox model by model selection, with high-dimensional covariates
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Adaptive density estimation: A curse of support?
- Near-ideal model selection by \(\ell _{1}\) minimization
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- SPADES and mixture models
- Lasso-type recovery of sparse representations for high-dimensional data
- On minimax density estimation on \(\mathbb R\)
- Asymptotics for Lasso-type estimators.
- Least angle regression. (With discussion)
- Near optimal thresholding estimation of a Poisson intensity on the real line
- Minimal penalties for Gaussian model selection
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Aggregation for Gaussian regression
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Atomic Decomposition by Basis Pursuit
- Stable recovery of sparse overcomplete representations in the presence of noise
- Ideal spatial adaptation by wavelet shrinkage
- A new approach to variable selection in least squares problems
- Aggregation and Sparsity Via ℓ1 Penalized Least Squares
- Sparse Density Estimation with ℓ1 Penalties