The Hausdorff metric of \(\sigma\)-fields and the value of information
Publication: 2365741
DOI: 10.1214/aop/1176989398
zbMath: 0777.62007
OpenAlex: W2091268372
MaRDI QID: Q2365741
Publication date: 29 June 1993
Published in: The Annals of Probability
Full work available at URL: https://doi.org/10.1214/aop/1176989398
Keywords: inequalities; Hausdorff metric; symmetric difference; sets of measurable functions; continuity of the value of information; metrization of stochastic convergence; sub-\(\sigma\)-fields
Classification: Probabilistic measure theory (60A10); Statistical aspects of information-theoretic topics (62B10); Sufficiency and information (62B99)
Related Items (6)
Cognitive limits and preferences for information ⋮ A couple of remarks on the convergence of \(\sigma\)-fields on probability spaces ⋮ Distances between \(\sigma \)-fields on a probability space ⋮ Filtrations for which all \({\mathcal H}^2\) martingales are of integrable variation; distances between \(\sigma\)-algebras ⋮ A theorem of the maximin and applications to Bayesian zero-sum games ⋮ Information, measurability, and continuous behavior.