


Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples (Q742670)

From MaRDI portal





scientific article; zbMATH DE number 6346088

    Statements

    Minimum mutual information and non-Gaussianity through the maximum entropy method: estimation from finite samples (English)
    19 September 2014
    Summary: The Minimum Mutual Information (MinMI) Principle provides the least-committed, maximum-joint-entropy (ME) inferential law that is compatible with prescribed marginal distributions and empirical cross constraints. Here, we estimate MI bounds (the MinMI values) generated by constraining sets \(\mathbf T_{cr}\) comprising \(m_{cr}\) linear and/or nonlinear joint expectations, computed from samples of \(N\) iid outcomes. Marginals (and their entropy) are imposed by single morphisms of the original random variables. \(N\)-asymptotic formulas are given for the distribution of the cross-expectation estimation errors and for the MinMI estimation bias, its variance, and its distribution. A growing \(\mathbf T_{cr}\) leads to an increasing MinMI, converging eventually to the total MI. Under \(N\)-sized samples, the MinMI increment relative to two nested sets \(\mathbf T_{cr1}\subset\mathbf T_{cr2}\) (with numbers of constraints \(m_{cr1}<m_{cr2}\)) is the test difference \(\delta H=H_{\max 1,N}-H_{\max 2,N}\geq 0\) between the two respective estimated MEs. Asymptotically, \(\delta H\) follows a chi-squared distribution \(\frac{1}{2N}\chi^2_{(m_{cr2}-m_{cr1})}\) whose upper quantiles determine whether the constraints in \(\mathbf T_{cr2}\setminus\mathbf T_{cr1}\) explain significant extra MI. As an example, we set the marginals to be normally distributed (Gaussian) and build a sequence of MI bounds, associated with successive nonlinear correlations due to joint non-Gaussianity. Since available sample sizes in real-world situations can be rather small, the relationship between MinMI bias, probability-density over-fitting, and outliers is demonstrated for under-sampled data.
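The significance test sketched in the summary can be made concrete: under the null hypothesis that the extra constraints in \(\mathbf T_{cr2}\) carry no additional MI, the statistic \(2N\,\delta H\) is asymptotically chi-squared with \(m_{cr2}-m_{cr1}\) degrees of freedom, so large values of \(\delta H\) (beyond the upper quantiles) indicate significant extra MI. A minimal Python sketch of this test follows; the function names are illustrative, not from the paper, and the chi-squared tail probability is computed from the standard regularized incomplete-gamma series to keep the example self-contained.

```python
import math

def chi2_sf(x, df):
    """Upper-tail probability P(X > x) for a chi-squared variable with
    df degrees of freedom, via the regularized lower incomplete gamma
    series P(a, z) with a = df/2, z = x/2."""
    a, z = df / 2.0, x / 2.0
    if z <= 0.0:
        return 1.0
    # Leading term z^a * exp(-z) / Gamma(a + 1), computed in log space.
    term = math.exp(a * math.log(z) - z - math.lgamma(a + 1.0))
    total = term
    n = 0
    while term > 1e-16 * total:
        n += 1
        term *= z / (a + n)   # next term of the series for P(a, z)
        total += term
    return 1.0 - total        # survival function = 1 - P(a, z)

def minmi_increment_test(delta_H, N, m_cr1, m_cr2, alpha=0.05):
    """Test whether the MinMI increment delta_H = H_max1 - H_max2 >= 0
    between nested constraint sets (m_cr1 < m_cr2 constraints) is
    significant: asymptotically 2*N*delta_H is chi-squared distributed
    with (m_cr2 - m_cr1) degrees of freedom."""
    df = m_cr2 - m_cr1
    stat = 2.0 * N * delta_H
    p_value = chi2_sf(stat, df)
    return stat, p_value, p_value < alpha

# Hypothetical usage: delta_H = 0.01 nats from N = 500 samples, going
# from 2 to 4 cross constraints, gives 2*N*delta_H = 10 on 2 dof.
stat, p, significant = minmi_increment_test(0.01, N=500, m_cr1=2, m_cr2=4)
```

In practice one would replace the hand-rolled `chi2_sf` with `scipy.stats.chi2.sf`; it is inlined here only so the sketch runs without external dependencies.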
    mutual information
    non-Gaussianity
    maximum entropy distributions
    entropy bias
    mutual information distribution
    morphism

    Identifiers