Costa’s concavity inequality for dependent variables based on the multivariate Gaussian copula
From MaRDI portal
Publication: Q6189092
DOI: 10.1017/jpr.2022.128 · OpenAlex: W4365146203 · MaRDI QID: Q6189092
Mohammad Hossein Alamatsaz, Fatemeh Asgari
Publication date: 12 January 2024
Published in: Journal of Applied Probability
Full work available at URL: https://doi.org/10.1017/jpr.2022.128
MSC classifications:
- 62P30 Applications of statistics in engineering and industry; control charts
- 94A17 Measures of information, entropy
- 94A15 Information theory (general)
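For context, the result named in the title can be stated in its classical form (a standard formulation of Costa's 1985 theorem; the cited paper's contribution is an extension to dependent variables via the multivariate Gaussian copula):

```latex
% Entropy power of a random vector X in R^n with differential entropy h(X):
%   N(X) = (1 / (2*pi*e)) * exp( 2 h(X) / n ).
% Costa's concavity inequality: for X independent of standard Gaussian
% noise Z, the map t -> N(X + sqrt(t) Z) is concave on t >= 0.
\[
  N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
  \qquad
  \frac{d^2}{dt^2}\, N\!\left(X + \sqrt{t}\,Z\right) \le 0,
  \quad t \ge 0 .
\]
```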
Cites Work
- A Mathematical Theory of Communication
- Extension of de Bruijn's identity to dependent non-Gaussian noise channels
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- A Conditional Entropy Power Inequality for Dependent Variables
- The Capacity Region of the Gaussian Multiple-Input Multiple-Output Broadcast Channel
- A new entropy power inequality
- A simple converse for broadcast channels with additive white Gaussian noise (Corresp.)
- A short proof of the "concavity of entropy power"
- Inequalities for the dependent Gaussian noise channels based on Fisher information and copulas
- An extension of entropy power inequality for dependent random variables
- Multivariate dispersion order and the notion of copula applied to the multivariate t-distribution
- The convolution inequality for entropy powers