Unique information via dependency constraints
DOI: 10.1088/1751-8121/aaed53 · zbMath: 1422.94006 · arXiv: 1709.06653 · OpenAlex: W2963492916 · Wikidata: Q129073853 · Scholia: Q129073853 · MaRDI QID: Q5235162
Jeffrey Emenheiser, Ryan G. James, James P. Crutchfield
Publication date: 7 October 2019
Published in: Journal of Physics A: Mathematical and Theoretical
Full work available at URL: https://arxiv.org/abs/1709.06653
Keywords: mutual information; information theory; statistical dependence; cybernetics; partial information decomposition
MSC classifications:
- Information theory (general) (94A15)
- Statistical aspects of information-theoretic topics (62B10)
- Sampling theory in information and communication theory (94A20)
- Communication theory (94A05)
- Coding theorems (Shannon theory) (94A24)
Related Items (4)
Uses Software
Cites Work
- A Mathematical Theory of Communication
- Papers on probability, statistics and statistical physics. Ed. by R. D. Rosenkrantz.
- Causation entropy identifies indirect influences, dominance of neighbors and anticipatory couplings
- An overview of reconstructability analysis
- Ross Ashby's information theory: a bit of history, some solutions to problems, and what we face today
- Information geometry on hierarchy of probability distributions
- Unconditionally secure key agreement and the intrinsic conditional information
- Shannon Entropy and Mutual Information for Multivariate Skew‐Elliptical Distributions
- Elements of Information Theory
- The lattice theory of information