Cumulative and relative cumulative residual information generating measures and associated properties
From MaRDI portal
Publication: 6170101
DOI: 10.1080/03610926.2021.2005100
OpenAlex: W3217807237
MaRDI QID: Q6170101
Omid Kharazmi, Narayanaswamy Balakrishnan
Publication date: 12 July 2023
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2021.2005100
Keywords: survival function; Kullback-Leibler divergence; information generating function; Jensen-Shannon entropy; Jensen-Gini mean difference; Jensen-information generating function
Cites Work
- A Mathematical Theory of Communication
- Stochastic orders
- The relative information generating function
- Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
- A Jensen-Gini measure of divergence with application in parameter estimation
- Non-parametric inference for Gini covariance and its variants
- Fractional cumulative residual entropy
- On the dynamic cumulative residual entropy
- Jensen-information generating function and its connections to some well-known information measures
- Testing Goodness-of-Fit for Exponential Distribution Based on Cumulative Residual Entropy
- Cumulative Residual Entropy: A New Measure of Information
- Connections of Gini, Fisher, and Shannon by Bayes risk under proportional hazards
- Cumulative Residual and Relative Cumulative Residual Fisher Information and Their Properties
- On Information and Sufficiency