
Refinements of Pinsker's inequality

From MaRDI portal
Publication: 4680111

DOI: 10.1109/TIT.2003.811927
zbMath: 1063.94017
OpenAlex: W2149306165
MaRDI QID: Q4680111

Alexei A. Fedotov, Peter Harremoës, Flemming Topsøe

Publication date: 1 June 2005

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://doi.org/10.1109/tit.2003.811927
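The publication listed above refines Pinsker's inequality, which bounds the Kullback-Leibler divergence from below by the squared total variation distance: D(P||Q) ≥ 2 V(P,Q)². As a minimal numerical illustration (not part of the portal record), the sketch below checks the classical inequality on random discrete distributions; all function names here are illustrative, not from the paper.

```python
import math
import random

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance V(P,Q) = (1/2) * sum |p_i - q_i|."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def random_dist(n, rng):
    """A random probability vector of length n (strictly positive entries)."""
    w = [rng.random() + 1e-9 for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

if __name__ == "__main__":
    rng = random.Random(0)
    for _ in range(1000):
        p = random_dist(5, rng)
        q = random_dist(5, rng)
        d = kl_divergence(p, q)
        v = total_variation(p, q)
        # Classical Pinsker inequality: D(P||Q) >= 2 * V(P,Q)^2
        assert d >= 2 * v * v
    print("Pinsker's inequality held in all 1000 random trials")
```

The refinements studied in the paper sharpen the right-hand side of this bound with higher-order terms in V; the sketch only verifies the classical form.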




Related Items (27)

Correlation distance and bounds for mutual information
New Limits to Classical and Quantum Instance Compression
AND-compression of NP-complete problems: streamlined proof and minor observations
Maximum entropy on compact groups
On the minimum \(f\)-divergence for given total variation
Discretizing Distributions with Exact Moments: Error Estimate and Convergence Analysis
Bounds of the Pinsker and Fannes types on the Tsallis relative entropy
Statistical learning guarantees for compressive clustering and compressive mixture modeling
Berry-Esseen bounds in the entropic central limit theorem
A Charlier-Parseval approach to Poisson approximation and its applications
Optimal upper bounds for the divergence of finite-dimensional distributions under a given variational distance
Fano type quantum inequalities in terms of \(q\)-entropies
Statistical and computational limits for sparse matrix detection
A historical perspective on Schützenberger-Pinsker inequalities
The entropic Erdős-Kac limit theorem
Generalization of the Kullback-Leibler divergence in the Tsallis statistics
On the empirical estimation of integral probability metrics
Some bounds for skewed \(\alpha\)-Jensen-Shannon divergence
On divergences of finite measures and their applicability in statistics and information theory
Some new maximal inequalities
On coupling of probability distributions and estimating the divergence through variation
Remainder terms for some quantum entropy inequalities
Neighborhood radius estimation for variable-neighborhood random fields
Mutual information of several random variables and its estimation via variation
Local limit theorems for densities in Orlicz spaces
Mutual information, variation, and Fano's inequality
Comparison of contraction coefficients for \(f\)-divergences




This page was built for publication: Refinements of Pinsker's inequality