Towards Lifted Inference Under Maximum Entropy for Probabilistic Relational FO-PCL Knowledge Bases
From MaRDI portal
Publication:3451211
DOI: 10.1007/978-3-319-20807-7_46 · zbMath: 1465.68234 · OpenAlex: W2399123559 · MaRDI QID: Q3451211
Christoph Beierle, Marc Finthammer, Josef Baudisch, Nico Potyka
Publication date: 10 November 2015
Published in: Lecture Notes in Computer Science
Full work available at URL: https://doi.org/10.1007/978-3-319-20807-7_46
Logic in artificial intelligence (68T27); Probability and inductive logic (03B48); Knowledge representation (68T30); Reasoning under uncertainty in the context of artificial intelligence (68T37)
Cites Work
- Combining probabilistic logic programming with the power of maximum entropy
- Probabilistic logic
- Achieving parametric uniformity for knowledge bases in a relational probabilistic conditional logic with maximum entropy semantics
- How to Exploit Parametric Uniformity for Maximum Entropy Reasoning in a Relational Probabilistic Logic
- First-order probabilistic conditional logic and maximum entropy
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
- Reasoning about knowledge and probability
- The Uncertain Reasoner's Companion
- Conditionals in nonmonotonic reasoning and belief revision. Considering conditionals as agents