Jensen-information generating function and its connections to some well-known information measures
From MaRDI portal
Publication:2657979
DOI: 10.1016/j.spl.2020.108995
zbMath: 1460.94028
OpenAlex: W3106581734
MaRDI QID: Q2657979
Omid Kharazmi, Narayanaswamy Balakrishnan
Publication date: 18 March 2021
Published in: Statistics & Probability Letters
Full work available at URL: https://doi.org/10.1016/j.spl.2020.108995
Kullback-Leibler divergence; Shannon entropy; information generating function; Jensen extropy; Jensen-Shannon entropy
Probability distributions: general theory (60E05); Measures of information, entropy (94A17); Information theory (general) (94A15)
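The keywords above center on the information generating function and its link to Shannon entropy. As a minimal sketch of that link, assuming the standard discrete form \(T(t) = \sum_i p_i^t\): since \(T'(t) = \sum_i p_i^t \log p_i\), the derivative at \(t = 1\) recovers the negative Shannon entropy (in nats). The distribution below is an illustrative example, not one from the paper.

```python
import math

def igf(p, t):
    """Information generating function T(t) = sum_i p_i**t
    of a discrete distribution p."""
    return sum(pi ** t for pi in p)

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]

# T(1) = 1 for any probability distribution.
print(igf(p, 1.0))

# A central-difference estimate of T'(1) matches -H(p).
h = 1e-6
deriv = (igf(p, 1.0 + h) - igf(p, 1.0 - h)) / (2 * h)
print(deriv, -shannon_entropy(p))
```

The Jensen-type constructions studied in the paper build on this generating function in the same way the Jensen-Shannon divergence builds on Shannon entropy.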
Related Items (8)
- Generating function for generalized Fisher information measure and its application to finite mixture models
- Information generating function of \(k\)-record values and its applications
- On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
- Cumulative and relative cumulative residual information generating measures and associated properties
- Weighted fractional generalized cumulative past entropy and its properties
- A novel method to generating two-sided class of probability distributions
- Information generating function of record values
- Optimal information, Jensen-RIG function and \(\alpha\)-Onicescu's correlation coefficient in terms of information generating functions
Cites Work
- A Mathematical Theory of Communication
- Entropies based on fractional calculus
- Stochastic orders
- The relative information generating function
- Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
- On measures of information energy
- On the time-dependent Fisher information of a density function
- A Jensen-Gini measure of divergence with application in parameter estimation
- Some properties of generalized Fisher information in the context of nonextensive thermostatistics
- Quantile-based reliability analysis
- Fractional cumulative residual entropy
- On the dynamic cumulative residual entropy
- On the Equivalence Between Stein and De Bruijn Identities
- Divergence measures based on the Shannon entropy
- Énergie informationnelle et notions apparentées
- Connections of Gini, Fisher, and Shannon by Bayes risk under proportional hazards
- Local Entropy Statistics for Point Processes
- Mixture Models, Bayes Fisher Information, and Divergence Measures
- On Information and Sufficiency