Entropy and Information Theory

From MaRDI portal
Publication:3058216

DOI: 10.1007/978-1-4419-7970-4
zbMath: 1216.94001
OpenAlex: W1496462336
MaRDI QID: Q3058216

Robert M. Gray

Publication date: 18 November 2010

Full work available at URL: https://doi.org/10.1007/978-1-4419-7970-4




Related Items (52)

- Highly symmetric POVMs and their informational power
- Improving quality of sample entropy estimation for continuous distribution probability functions
- Transfer mutual information: A new method for measuring information transfer to the interactions of time series
- Upper Bounds on Mixing Time of Finite Markov Chains
- THE STRONG LIMIT THEOREM FOR RELATIVE ENTROPY DENSITY RATES BETWEEN TWO ASYMPTOTICALLY CIRCULAR MARKOV CHAINS
- The Shannon's mutual information of a multiple antenna time and frequency dependent channel: An ergodic operator approach
- Characterization of time series via Rényi complexity-entropy curves
- Analyzing local and global properties of multigraphs
- From typical sequences to typical genotypes
- Subspace learning for unsupervised feature selection via matrix factorization
- One-sided asymptotically mean stationary channels
- Global and local structure preserving sparse subspace learning: an iterative approach to unsupervised feature selection
- A proof of Sanov's theorem via discretizations
- Tropical limit and a micro-macro correspondence in statistical physics
- The existence of an optimal deterministic contract in moral hazard problems
- Entropy rate of product of independent processes
- Statistical complexity as a criterion for the useful signal detection problem
- The Wasserstein distance of order 1 for quantum spin systems on infinite lattices
- Correction to: "Quenched large deviation principle for words in a letter sequence"
- Independent Nonlinear Component Analysis
- Link prediction based on the mutual information with high-order clustering structure of nodes in complex networks
- XIRAC-Q: a near-real-time quantum operating system scheduling structure based on Shannon information theorem
- Measurements-based constrained control optimization in presence of uncertainties with application to the driver commands for high-speed trains
- Substitution-dynamics and invariant measures for infinite alphabet-path space
- Rough set methods in feature selection via submodular function
- Confidence intervals and hypothesis testing for the Permutation Entropy with an application to epilepsy
- An information theoretic approach to post randomization methods under differential privacy
- Entropy-based closure for probabilistic learning on manifolds
- Some extensions of the operator entropy type inequalities
- Kullback-Leibler Approach to CUSUM Quickest Detection Rule for Markovian Time Series
- A sparse regular approximation lemma
- New statistical models of nonergodic cognitive systems and their pathologies
- Clustered entropy for edge detection
- Condensation in stochastic particle systems with stationary product measures
- Event-triggered minimax state estimation with a relative entropy constraint
- Brownian motion on stationary random manifolds
- In between the \(LQG/H_2\)- and \(H_{\infty}\)-control theories
- Exponential forgetting of smoothing distributions for pairwise Markov models
- Resolution dependence of the maximal information coefficient for noiseless relationship
- Learning to Optimize via Information-Directed Sampling
- Lectures on Entropy. I: Information-Theoretic Notions
- Fuzzy c-means clustering with conditional probability based K–L information regularization
- Nonequilibrium in thermodynamic formalism: the second law, gases and information geometry
- A Szegő type theorem and distribution of symplectic eigenvalues
- The Chaos Game on a General Iterated Function System from a Topological Point of View
- Varentropy of past lifetimes
- Entropy and dimension of disintegrations of stationary measures
- On typical encodings of multivariate ergodic sources
- A remark on the maximum entropy principle in uncertainty theory
- Around the variational principle for metric mean dimension
- Poisson-Dirichlet asymptotics in condensing particle systems
- On the entropy of couplings
This page was built for publication: Entropy and Information Theory