On Two Forms of Fisher's Measure of Information
From MaRDI portal
Publication:5314578
DOI: 10.1081/STA-200063386
zbMath: 1073.62003
MaRDI QID: Q5314578
Kosmas Ferentinos, Takis Papaioannou
Publication date: 5 September 2005
Published in: Communications in Statistics - Theory and Methods
Keywords: additivity; convolutions; superadditivity; limit theory; Cramér-Rao inequality; Fisher information number; maximal information; observed and expected Fisher information; shift-invariant Fisher information
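Two of the keywords, observed and expected Fisher information and additivity, can be made concrete with a small numerical sketch. The following is an illustration under assumed conventions, not code from the paper: for an i.i.d. exponential sample with rate λ, the observed information at the MLE and the expected (Fisher) information agree closely for large samples, and the expected information of n observations is n times the per-observation value.

```python
import random

# Hedged illustration (not from the paper): for an i.i.d. Exp(lam) sample,
# the log-likelihood is l(lam) = n*log(lam) - lam*sum(x), so
#   observed information:  -l''(lam_hat) = n / lam_hat**2  at the MLE
#   expected information:  E[-l''(lam)]  = n / lam**2

def observed_information(x):
    """Observed Fisher information -l''(lam_hat) at the MLE lam_hat = n/sum(x)."""
    n = len(x)
    lam_hat = n / sum(x)
    return n / lam_hat**2

def expected_information(n, lam):
    """Expected Fisher information of n i.i.d. Exp(lam) observations."""
    return n / lam**2  # additivity: n times the single-observation information

random.seed(0)
lam = 2.0
x = [random.expovariate(lam) for _ in range(10_000)]

obs = observed_information(x)
exp_info = expected_information(len(x), lam)
print(obs, exp_info)  # the two values converge as the sample size grows
```

The exponential family is a convenient case because the observed information at the MLE equals the expected information evaluated at the MLE; for other models the two forms can differ, which is the kind of contrast the publication's keywords point to.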
Related Items
- Bayesian information in an experiment and the Fisher information distance
- On the time-dependent Fisher information of a density function
- Some information theoretic ideas useful in statistical inference
- Projection pursuit via white noise matrices
- On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
- ℒ1-Limit of Trimmed Sums of Order Statistics from Location-Scale Distributions with Applications to Type II Censored Data Analysis
- Covariate Information Matrix for Sufficient Dimension Reduction
- Divergences without probability vectors and their applications
Cites Work
- An extension of the information inequality and related characterizations
- Superadditivity of Fisher's information and logarithmic Sobolev inequalities
- An application of the convolution inequality for the Fisher information
- On the entropic measures of stochastic dependence
- Some characteristic properties of the Fisher information matrix via Cacoullos-type inequalities
- Statistical meaning of Carlen's superadditivity of the Fisher information
- Measures of multivariate dependence based on a distance between Fisher information matrices
- Relation between the covariance and Fisher information matrices
- Limiting properties of some measures of information
- New parametric measures of information
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher information
- On a measure of dependence based on Fisher's information matrix
- Applications to Optics and Wave Mechanics of the Criterion of Maximum Cramer-Rao Bound
- On an Analogue of Bhattacharya Bound
- Identification in Parametric Models
- A discrete version of the Stam inequality and a characterization of the Poisson distribution