Approximate inference in Bayesian networks: parameterized complexity results
DOI: 10.1016/j.ijar.2017.10.029 · zbMATH: 1446.68076 · OpenAlex: W2766800087 · MaRDI QID: Q1726381
Publication date: 20 February 2019
Published in: International Journal of Approximate Reasoning
Full work available at URL: https://doi.org/10.1016/j.ijar.2017.10.029
Keywords: sampling; Bayesian networks; stochastic algorithms; approximate inference; de-randomization; parameterized complexity
MSC classifications: Reasoning under uncertainty in the context of artificial intelligence (68T37); Randomized algorithms (68W20); Probabilistic graphical models (62H22); Parameterized complexity, tractability and kernelization (68Q27)
Related Items (1)
Cites Work
- Fundamentals of parameterized complexity
- Most probable explanations in Bayesian networks: complexity and tractability
- The complexity of computing the permanent
- Approximating probabilistic inference in Bayesian belief networks is NP-hard
- Complexity of probabilistic reasoning in directed-path singly-connected Bayes networks
- An optimal approximation algorithm for Bayesian inference
- Parameterized complexity theory
- The computational complexity of probabilistic inference using Bayesian belief networks
- Tree-Width and the Computational Complexity of MAP Approximations in Bayesian Networks
- The Necessity of Bounded Treewidth for Efficient Inference in Bayesian Networks
- The polynomial-time hierarchy and sparse oracles
- A fast and simple randomized parallel algorithm for the maximal independent set problem
- Monte-Carlo approximation algorithms for enumeration problems
- Computational Complexity
- Some optimal inapproximability results