Lectures on Entropy. I: Information-Theoretic Notions
DOI: 10.1007/978-3-030-13046-6_4 · zbMath: 1452.81022 · OpenAlex: W2955566898 · MaRDI QID: Q4969610
Publication date: 13 October 2020
Published in: Open Quantum Systems
Full work available at URL: https://doi.org/10.1007/978-3-030-13046-6_4
Keywords: parameter estimation; hypothesis testing; law of large numbers; maximum likelihood estimator; cumulant generating function; Boltzmann-Gibbs-Shannon entropy; Fisher entropy; Jensen-Shannon entropy and metric; probability on finite sets; Rényi's relative entropy
- Introductory exposition (textbooks, tutorial papers, etc.) pertaining to probability theory (60-01)
- Measures of information, entropy (94A17)
- Information theory (general) (94A15)
- Statistical aspects of information-theoretic topics (62B10)
- Open systems, reduced dynamics, master equations, decoherence (81S22)
- Quantum entropies (81P17)
Cites Work
- A Mathematical Theory of Communication
- The epic story of maximum likelihood
- Axiomatic characterizations of information measures
- On measures of information and their characterizations
- A new class of metric divergences on probability spaces and its applicability in statistics
- On the nonequilibrium entropy of large and small systems
- An Introduction to Probability and Statistics
- Entropy and Information Theory
- Statistical Physics and Information Theory
- Entropic fluctuations in statistical mechanics: I. Classical dynamical systems
- Divergence measures based on the Shannon entropy
- Information Theory and Statistical Mechanics
- A large deviations approach to error exponents in source coding and hypothesis testing
- A new metric for probability distributions
- An Extended Cencov Characterization of the Information Metric
- Why the Shannon and Hartley entropies are ‘natural’
- Asymptotic Statistics
- A short characterization of relative entropy
- Fifty years of Shannon theory
- Testing Statistical Hypotheses
- Asymptotically Optimal Tests for Multinomial Distributions
- Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen [On mean values and entropies of complete probability distributions]
- On Information and Sufficiency
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations
- An invariant form for the prior probability in estimation problems
- Large deviations