Mean mutual information and symmetry breaking for finite random fields
Publication: 424692
DOI: 10.1214/11-AIHP416
zbMath: 1259.94032
MaRDI QID: Q424692
Jérôme Buzzi, Lorenzo Zambotti
Publication date: 4 June 2012
Published in: Annales de l'Institut Henri Poincaré. Probabilités et Statistiques
Full work available at URL: https://projecteuclid.org/euclid.aihp/1334148202
Classification (MSC):
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Combinatorial probability (60C05)
- Measures of information, entropy (94A17)
Related Items (3)
Dynamical intricacy and average sample complexity for random bundle transformations ⋮ Dynamical intricacy and average sample complexity of amenable group actions ⋮ Approximate maximizers of intricacy functionals
Cites Work
- Approximate maximizers of intricacy functionals
- Analytical description of the evolution of neural networks: Learning rules and complexity
- Nonnegative entropy measures of multivariate symmetric correlations
- Polymatroidal dependence structure of a set of random variables
- Information Inequalities for Joint Distributions, With Interpretations and Applications
- Random Fragmentation and Coagulation Processes
- Elements of Information Theory