Scalable importance sampling estimation of Gaussian mixture posteriors in Bayesian networks
DOI: 10.1016/j.ijar.2018.06.004 · zbMath: 1453.62310 · OpenAlex: W2807754335 · MaRDI QID: Q1783942
Helge Langseth, Antonio Salmerón, Anders L. Madsen, Darío Ramos-López, Thomas D. Nielsen, Rafael Rumí, Andrés R. Masegosa
Publication date: 21 September 2018
Published in: International Journal of Approximate Reasoning
Full work available at URL: http://hdl.handle.net/11250/2596038
Bayesian networks; importance sampling; Gaussian mixtures; scalable inference; conditional linear Gaussian models
Computational methods for problems pertaining to statistics (62-08); Gaussian processes (60G15); Sampling theory, sample surveys (62D05)
Cites Work
- Approximate probability propagation with mixtures of truncated exponentials
- Graphical models for associations between variables, some of which are qualitative and some quantitative
- Dynamic importance sampling in Bayesian networks based on probability trees
- Anytime anyspace probabilistic inference
- Binary join trees for computing marginals in the Shenoy-Shafer architecture
- Lazy propagation: A junction tree inference algorithm based on lazy evaluation
- Scaling up Bayesian variational inference using distributed computing clusters
- Importance sampling algorithms for Bayesian networks: principles and performance
- Bayesian Networks and Decision Graphs
- On Information and Sufficiency
- A Stochastic Approximation Method