A general minimax result for relative entropy
From MaRDI portal
Publication:4345628
DOI: 10.1109/18.605594 · zbMath: 0878.94038 · OpenAlex: W2112743058 · MaRDI QID: Q4345628
Publication date: 13 January 1998
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.605594
Keywords: relative entropy; Kullback-Leibler divergence; density estimation; computational learning; Bayes risk; minimax risk; channel capacity; source coding; minimax redundancy
MSC classifications: Measures of information, entropy (94A17) · Statistical aspects of information-theoretic topics (62B10) · Source coding (94A29)
Related Items (14)
Bayesian testing of a point null hypothesis based on the latent information prior ⋮
Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory ⋮
Minoration via mixed volumes and Cover's problem for general channels ⋮
A lower-bound for the maximin redundancy in pattern coding ⋮
Unnamed Item ⋮
Mutual information, metric entropy and cumulative relative entropy risk ⋮
Predicting a binary sequence almost as well as the optimal biased coin ⋮
Game-theoretic probability combination with applications to resolving conflicts between statistical methods ⋮
Some Sufficient Conditions on an Arbitrary Class of Stochastic Processes for the Existence of a Predictor ⋮
Shannon optimal priors on independent identically distributed statistical experiments converge weakly to Jeffreys prior ⋮
Unnamed Item ⋮
Bayesian predictive densities based on latent information priors ⋮
A scaling law from discrete to continuous solutions of channel capacity problems in the low-noise limit ⋮
Information-theoretic determination of minimax rates of convergence