Combining a relaxed EM algorithm with Occam's razor for Bayesian variable selection in high-dimensional regression
From MaRDI portal
Publication: Q268752
DOI: 10.1016/j.jmva.2015.09.004
zbMath: 1334.62114
OpenAlex: W1803361327
MaRDI QID: Q268752
Julien Chiquet, Pierre Latouche, Pierre-Alexandre Mattei, Charles Bouveyron
Publication date: 15 April 2016
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2015.09.004
Related Items (3)
- An introduction to recent advances in high/infinite dimensional statistics
- A novel variational Bayesian method for variable selection in logistic regression models
- Bayesian variable selection for globally sparse probabilistic PCA
Cites Work
- Bayesian variable selection with shrinking and diffusing priors
- The Adaptive Lasso and Its Oracle Properties
- The discriminative functional mixture model for a comparative analysis of bike sharing systems
- A study of variable selection using \(g\)-prior distribution with ridge parameter
- Variable selection in infinite-dimensional problems
- Exponential screening and optimal rates of sparse estimation
- Bayes and empirical-Bayes multiplicity adjustment in the variable-selection problem
- A majorization-minimization approach to variable selection using spike and slab priors
- On the distribution of penalized maximum likelihood estimators: the LASSO, SCAD, and thresholding
- On the convergence properties of the EM algorithm
- Least angle regression. (With discussion)
- PAC-Bayesian bounds for sparse regression estimation with exponential weights
- Spike and slab variable selection: frequentist and Bayesian strategies
- Aggregation for Gaussian regression
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Functional data analysis.
- Nonparametric functional data analysis. Theory and practice.
- Regularization in regression: comparing Bayesian and frequentist methods in a poorly informative situation
- Calibration and empirical Bayes variable selection
- Bayes and empirical Bayes: do they merge?
- Optimization with Sparsity-Inducing Penalties
- The EM Algorithm and Extensions, 2E
- Mixtures of g Priors for Bayesian Variable Selection
- Estimating Optimal Transformations for Multiple Regression and Correlation
- Bayesian Variable Selection in Linear Regression
- Atomic Decomposition by Basis Pursuit
- Sparse Bayesian learning and the relevance vector machine (DOI: 10.1162/15324430152748236)
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Sparse Approximate Solutions to Linear Systems
- Bayes Factors
- A Limited Memory Algorithm for Bound Constrained Optimization
- Bayesian Model Selection in High-Dimensional Settings
- EMVS: The EM Approach to Bayesian Variable Selection
- Regularization and Variable Selection Via the Elastic Net
- Spike and Slab Gene Selection for Multigroup Microarray Data
- A review of Bayesian variable selection methods: what, how and which