Bayesian inversion with Student's \(t\) priors based on Gaussian scale mixtures
Publication: 6641742
DOI: 10.1088/1361-6420/ad75af
MaRDI QID: Q6641742
Felipe Uribe, Angelina Senchukova, Lassi Roininen
Publication date: 21 November 2024
Published in: Inverse Problems
Keywords: Gibbs sampler; Markov random fields; Bayesian inverse problems; Gaussian scale mixture; Bayesian hierarchical modeling; Student's \(t\) distribution
Mathematics Subject Classification: Parametric inference (62Fxx); Inference from stochastic processes (62Mxx); Probabilistic methods, stochastic differential equations (65Cxx)
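The keywords point to the standard Gaussian scale mixture representation of the Student's \(t\) distribution; a minimal statement of this identity, in generic notation not taken from the paper itself, is
\[
x \mid \lambda \sim \mathcal{N}(0, \lambda \sigma^2), \qquad \lambda \sim \mathrm{Inv\text{-}Gamma}\!\left(\tfrac{\nu}{2}, \tfrac{\nu}{2}\right) \;\Longrightarrow\; x \sim t_\nu(0, \sigma^2),
\]
which makes the model conditionally Gaussian and thereby amenable to Gibbs-type sampling in hierarchical Bayesian inversion.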
Cites Work
- The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo
- Whittle-Matérn priors for Bayesian statistical inversion with applications in electrical impedance tomography
- Objective prior for the number of degrees of freedom of a \(t\) distribution
- Dimensionality reduction and polynomial chaos acceleration of Bayesian inference in inverse problems
- Simple conditions for the convergence of the Gibbs sampler and Metropolis-Hastings algorithms
- Hierarchical models with scale mixtures of normal distributions
- Statistical and computational inverse problems.
- Penalising model component complexity: a principled, practical approach to constructing priors
- Markov chains for exploring posterior distributions. (With discussion)
- Gaussian Markov random field priors for inverse problems
- Objective Bayesian analysis for the Student-\(t\) linear regression
- Rank-normalization, folding, and localization: an improved \(\widehat{R}\) for assessing convergence of MCMC (with Discussion)
- Cauchy Markov random field priors for Bayesian inversion
- Discretization-invariant Bayesian inversion and Besov space priors
- Cauchy difference priors for edge-preserving Bayesian inversion
- Bayesian neural network priors for edge-preserving inversion
- Likelihood-informed dimension reduction for nonlinear inverse problems
- Model-Based Clustering of Non-Gaussian Panel Data Based on Skew-t Distributions
- The Mixture of Normal Distributions with Different Variances
- A Gaussian hypermodel to recover blocky objects
- Hypermodels in the Bayesian imaging framework
- Integral equation models for image restoration: high accuracy methods and fast algorithms
- Objective Bayesian analysis for the Student-t regression model
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- On scale mixtures of normal distributions
- On Bayesian Modeling of Fat Tails and Skewness
- Gaussian Markov Random Fields
- Sparse Online Variational Bayesian Regression
- Edge adaptive hybrid regularization model for image deblurring
- Sparsity Promoting Hybrid Solvers for Hierarchical Bayesian Inverse Problems
- Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors
- Discrete Inverse Problems
- An Introduction to Data Analysis and Uncertainty Quantification for Inverse Problems
- Estimation and decision for linear systems with elliptical random processes
- Laplace-distributed increments, the Laplace prior, and edge-preserving regularization
- MCMC methods for functions: modifying old algorithms to make them faster
- Bayesian inversion with α-stable priors
- Horseshoe Priors for Edge-Preserving Linear Bayesian Inversion
- Geometry parameter estimation for sparse X-ray log imaging
- Bayesian inference with subset simulation in varying dimensions applied to the Karhunen-Loève expansion