Nonparametric estimation of information-based measures of statistical dispersion (Q406103)
From MaRDI portal
scientific article; zbMATH DE number 6341014
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Nonparametric estimation of information-based measures of statistical dispersion | scientific article; zbMATH DE number 6341014 | |
Statements
Nonparametric estimation of information-based measures of statistical dispersion (English)
8 September 2014
Summary: We address the problem of nonparametric estimation of the recently proposed measures of statistical dispersion of positive continuous random variables. The measures are based on the concepts of differential entropy and Fisher information and describe the "spread" or "variability" of the random variable from a different point of view than the ubiquitously used concept of standard deviation. The maximum penalized likelihood estimation of the probability density function proposed by Good and Gaskins is applied, and a complete methodology for estimating the dispersion measures with a single algorithm is presented. We illustrate the approach on three standard statistical models describing neuronal activity.
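The entropy-based notion of dispersion mentioned in the summary can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: it substitutes a plain Gaussian kernel density estimate for the Good and Gaskins maximum penalized likelihood estimator, and uses exp(differential entropy) as a generic entropy-based spread measure on the scale of the data. The function names and the Silverman bandwidth rule are assumptions for this sketch.

```python
import numpy as np

def gaussian_kde(samples, grid, bandwidth):
    # Simple Gaussian kernel density estimate evaluated on a grid
    # (a stand-in for the Good-Gaskins penalized likelihood estimator).
    diffs = (grid[:, None] - samples[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diffs ** 2) / np.sqrt(2.0 * np.pi)
    return kernels.sum(axis=1) / (len(samples) * bandwidth)

def entropy_dispersion(samples, n_grid=2048):
    # Silverman's rule-of-thumb bandwidth (an assumed choice).
    h = 1.06 * samples.std() * len(samples) ** (-1 / 5)
    grid = np.linspace(samples.min() - 4 * h, samples.max() + 4 * h, n_grid)
    f = gaussian_kde(samples, grid, h)
    dx = grid[1] - grid[0]
    mask = f > 0
    # Differential entropy H = -integral of f * log(f), via a Riemann sum.
    H = -np.sum(f[mask] * np.log(f[mask])) * dx
    # exp(H) has the units of the data and grows with the "spread" of f.
    return np.exp(H)

rng = np.random.default_rng(0)
# A positive continuous random variable, as in the paper's setting.
x = rng.exponential(scale=1.0, size=5000)
print(entropy_dispersion(x))
```

For the unit-scale exponential distribution the true differential entropy is 1 nat, so the estimated dispersion should lie near e; kernel smoothing and the boundary at zero introduce some bias.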
Keywords: statistical dispersion; entropy; Fisher information; nonparametric density estimation