An extension of entropy power inequality for dependent random variables
Publication: 5092675
DOI: 10.1080/03610926.2020.1813305
OpenAlex: W3082700798
MaRDI QID: Q5092675
Fatemeh Asgari, Mohammad Hossein Alamatsaz
Publication date: 22 July 2022
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2020.1813305
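For context, the classical entropy power inequality, which this publication extends to dependent random variables, is usually stated as follows (a standard textbook formulation, not quoted from the paper): for independent $\mathbb{R}^n$-valued random vectors $X$ and $Y$ with densities,
\[
  N(X+Y) \;\ge\; N(X) + N(Y),
  \qquad
  N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
\]
where $h(\cdot)$ denotes differential entropy and $N(\cdot)$ the entropy power; equality holds when $X$ and $Y$ are Gaussian with proportional covariance matrices. Several of the works cited below give proofs and variants of this inequality.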
Related Items (2)
- Costa’s concavity inequality for dependent variables based on the multivariate Gaussian copula
- Differential entropy of induced random state ensemble
Cites Work
- A Mathematical Theory of Communication
- Entropy and the central limit theorem
- Fisher information inequalities and the central limit theorem
- On the Equivalence Between Stein and De Bruijn Identities
- Extension of de Bruijn's identity to dependent non-Gaussian noise channels
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- A Conditional Entropy Power Inequality for Dependent Variables
- A simple proof of the entropy-power inequality
- The Capacity Region of the Gaussian Multiple-Input Multiple-Output Broadcast Channel
- An Extremal Inequality Motivated by Multiterminal Information-Theoretic Problems
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- A new entropy power inequality
- A simple converse for broadcast channels with additive white Gaussian noise (Corresp.)
- Variants of the Entropy Power Inequality
- Inequalities for the dependent Gaussian noise channels based on Fisher information and copulas
- Information Theoretic Proofs of Entropy Power Inequalities
- The convolution inequality for entropy powers
- Yet Another Proof of the Entropy Power Inequality