Hierarchical Bayesian models and sparsity: ℓ2-magic
Publication: 4625206
DOI: 10.1088/1361-6420/aaf5ab · zbMath: 1490.62078 · OpenAlex: W2910526316 · MaRDI QID: Q4625206
Erkki Somersalo, Alexander Strang, Daniela Calvetti
Publication date: 22 February 2019
Published in: Inverse Problems
Full work available at URL: https://doi.org/10.1088/1361-6420/aaf5ab
Mathematics Subject Classification: Bayesian inference (62F15) · Ill-posedness and regularization problems in numerical linear algebra (65F22) · Numerical optimization and variational techniques (65K10)
Related Items (24)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors
- Sparse Online Variational Bayesian Regression
- Sampling-based Spotlight SAR Image Reconstruction from Phase History Data for Speckle Reduction and Uncertainty Quantification
- Automatic fidelity and regularization terms selection in variational image restoration
- Empirical Bayesian Inference Using a Support Informed Prior
- Sparsity promoting reconstructions via hierarchical prior models in diffuse optical tomography
- Generalized Sparse Bayesian Learning and Application to Image Reconstruction
- Sequential edge detection using joint hierarchical Bayesian learning
- Adaptive anisotropic Bayesian meshing for inverse problems
- Horseshoe Priors for Edge-Preserving Linear Bayesian Inversion
- Sequential image recovery using joint hierarchical Bayesian learning
- Inducing sparsity via the horseshoe prior in imaging problems
- On and Beyond Total Variation Regularization in Imaging: The Role of Space Variance
- Hierarchical ensemble Kalman methods with sparsity-promoting generalized gamma hyperpriors
- Reconciling Bayesian and Perimeter Regularization for Binary Inversion
- Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach Part I: Methodology and Experiments
- Sparsity Promoting Hybrid Solvers for Hierarchical Bayesian Inverse Problems
- Bayesian Mesh Adaptation for Estimating Distributed Parameters
- Where Bayes tweaks Gauss: conditionally Gaussian priors for stable multi-dipole estimation
- Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors
- Solving inverse problems using data-driven models
- Overcomplete representation in a hierarchical Bayesian framework
- A matrix-free fixed-point iteration for inverting cascade impactor measurements with instrument's sensitivity kernels and hardware
- Bayesian hierarchical dictionary learning
Cites Work
- Conditionally Gaussian Hypermodels for Cerebral Source Localization
- A Gaussian hypermodel to recover blocky objects
- A hierarchical Krylov–Bayes iterative inverse solver for MEG with physiological preconditioning
- Decoding by Linear Programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- Iteratively reweighted least squares minimization for sparse recovery
- Introduction to Bayesian Scientific Computing
- For most large underdetermined systems of equations, the minimal ℓ1-norm near-solution approximates the sparsest near-solution
- Stable signal recovery from incomplete and inaccurate measurements