Mutual information, metric entropy and cumulative relative entropy risk
Publication:1383090
DOI: 10.1214/aos/1030741081 · zbMath: 0920.62007 · OpenAlex: W1990645294 · MaRDI QID: Q1383090
Publication date: 21 September 1999
Published in: The Annals of Statistics
Full work available at URL: https://doi.org/10.1214/aos/1030741081
Mathematics Subject Classification:
- Minimax procedures in statistical decision theory (62C20)
- Information theory (general) (94A15)
- Statistical aspects of information-theoretic topics (62B10)
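For orientation, the quantity named in the title admits a one-line summary (a sketch in standard notation, assumed here rather than quoted from the record): by the chain rule for relative entropy, the cumulative Kullback-Leibler risk of the Bayes predictive density telescopes into the mutual information between the parameter and the sample. With $\theta$ drawn from the prior, $X^n=(X_1,\dots,X_n)$ the observations, and $m_n$ the Bayes mixture (marginal) distribution,

\[
  \sum_{i=1}^{n} \mathbb{E}\, D\bigl(P_\theta(\,\cdot\mid X^{i-1}) \,\big\|\, m(\,\cdot\mid X^{i-1})\bigr)
  \;=\; \mathbb{E}_\theta\, D\bigl(P_\theta^{\,n} \,\big\|\, m_n\bigr)
  \;=\; I(\theta; X^n),
\]

where the first expectation is over both $\theta$ and the data; the paper then bounds this quantity above and below through the metric entropy of the parameter space.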
Related Items
- Optimal quantization of the support of a continuous multivariate distribution based on mutual information
- Simultaneous prediction of independent Poisson observables
- Convergence rates of deep ReLU networks for multiclass classification
- Bayesian parametric inference in a nonparametric framework
- Loss of information of a statistic for a family of non-regular distributions. II: More general case
- Information aware max-norm Dirichlet networks for predictive uncertainty estimation
- Hölder's identity
- Catching up Faster by Switching Sooner: A Predictive Approach to Adaptive Estimation with an Application to the AIC–BIC Dilemma
- Adaptive Design Optimization: A Mutual Information-Based Approach to Model Discrimination in Cognitive Science
- Quantifying Information Conveyed by Large Neuronal Populations
- Conjugate Priors Represent Strong Pre-Experimental Assumptions
- Predictability, Complexity, and Learning
- Prequential analysis of complex data with adaptive model reselection
- Information-theoretic determination of minimax rates of convergence
- Statistical Decision Problems and Bayesian Nonparametric Methods
- Entropy-SGD: biasing gradient descent into wide valleys
- Mixing strategies for density estimation
- Improved lower bounds for learning from noisy examples: An information-theoretic approach
Cites Work
- Some limit theorems for empirical processes (with discussion)
- On density estimation in the view of Kolmogorov's ideas in approximation theory
- On the consistency of Bayes estimates
- Stochastic complexity and modeling
- Asymptotic methods in statistical decision theory
- The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem
- Information contained in a sequence of observations
- Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension
- Rates of convergence for minimum contrast estimators
- Jeffreys' prior is asymptotically least favorable under entropy risk
- Differential geometry of curved exponential families. Curvatures and information loss
- A loss bound model for on-line stochastic prediction algorithms
- Probability inequalities for likelihood ratios and convergence rates of sieve MLEs
- On the stochastic complexity of learning realizable and unrealizable rules
- On estimating a density using Hellinger distance and some other strange facts
- Entropies of several sets of real valued functions
- Transformations of Wiener integrals under translations
- Recent Developments in Nonparametric Density Estimation
- Information-theoretic asymptotics of Bayes methods
- A bound on the financial value of information
- A source matching approach to finding minimax codes
- Density estimation by stochastic complexity
- Distribution estimation consistent in total variation and in two types of information divergence
- There is no universal source code for an infinite source alphabet
- A general minimax result for relative entropy
- Approximation dans les espaces métriques et théorie de l'estimation
- A strong version of the redundancy-capacity theorem of universal coding
- Lower bounds on expected redundancy for nonparametric classes