Freedman’s Paradox: A Solution Based on Normalized Entropy
From MaRDI portal
Publication: 5048357
DOI: 10.1007/978-3-030-56219-9_16
OpenAlex: W3108766975
MaRDI QID: Q5048357
Publication date: 16 November 2022
Published in: Contributions to Statistics
Full work available at URL: https://doi.org/10.1007/978-3-030-56219-9_16
Cites Work
- A Mathematical Theory of Communication
- The data-constrained generalized maximum entropy estimator of the GLM: asymptotic theory and inference
- Normalized entropy aggregation for inhomogeneous large-scale data
- Information Theory and Statistical Mechanics
- Capturing the Intangible Concept of Information
- Foundations of Info-Metrics
- Ridge Regression and Generalized Maximum Entropy: An improved version of the Ridge–GME parameter estimator
- Valid P-Values Behave Exactly as They Should: Some Misleading Criticisms of P-Values and Their Resolution With S-Values
- Coup de Grâce for a Tough Old Bull: “Statistically Significant” Expires
- The ASA Statement on p-Values: Context, Process, and Purpose
- A simultaneous estimation and variable selection rule