On convergence of conditional probability measures
DOI: 10.2996/kmj/1138037002 · zbMath: 0599.60029 · OpenAlex: W1986166708 · MaRDI QID: Q1080267
Publication date: 1985
Published in: Kodai Mathematical Journal
Full work available at URL: https://doi.org/10.2996/kmj/1138037002
Keywords: Kullback-Leibler information; total variation metric; conditional probability measures given sample means; minimum I-divergences; Sanov-type large deviation theorems
MSC classifications: Large deviations (60F10); Convergence of probability measures (60B10); Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10); Limit theorems for vector-valued random variables, infinite-dimensional case (60B12)
Cites Work
- Large deviations of the sample mean in general vector spaces
- Large deviation theorems for empirical probability measures
- A conditional law of large numbers
- I-divergence geometry of probability distributions and minimization problems
- Conditional expectation in an operator algebra. III.
- Conditional expectation in an operator algebra. IV. Entropy and information
- The asymptotic distribution of information per unit cost concerning a linear hypothesis for means of two given normal populations
- On Information and Sufficiency
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the sum of Observations