Bayesian and frequentist inference derived from the maximum entropy principle with applications to propagating uncertainty about statistical methods
From MaRDI portal
Publication:6640144
DOI: 10.1007/s00362-024-01597-3
MaRDI QID: Q6640144
Publication date: 18 November 2024
Published in: Statistical Papers
Keywords: uncertainty quantification; foundations of statistics; fiducial inference; theory of errors; \(p\)-hacking; replication crisis; certainty distribution; approximate confidence distribution; distribution of certainty; ordered principle of maximum entropy
Cites Work
- Unnamed Item
- A prior-free framework of coherent inference and its derivation of simple shrinkage estimators
- Fiducial prediction intervals
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
- Application of the maximum relative entropy method to the physics of ferromagnetic materials
- Conditional fiducial models
- Self-consistent confidence sets and tests of composite hypotheses applicable to restricted parameters
- Combining information from independent sources through confidence distributions
- Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory
- The ASA President's task force statement on statistical significance and replicability
- The Dempster-Shafer calculus for statisticians
- Differential entropy and dynamics of uncertainty
- The comparison of samples with possibly unequal variances.
- Rules of proof for maximal entropy inference
- Using confidence distribution sampling to visualize confidence sets
- The Highest Confidence Density Region and Its Usage for Joint Inferences about Constrained Parameters
- Information Theory and Statistical Mechanics
- Confidence, Likelihood, Probability
- In Defence of Objective Bayesianism
- Confidence and Likelihood*
- Probability Theory
- Inferential Models: A Framework for Prior-Free Posterior Probabilistic Inference
- Publication Policies for Replicable Research and the Community-Wide False Discovery Rate
- Null Hypothesis Significance Testing Interpreted and Calibrated by Estimating Probabilities of Sign Errors: A Bayes-Frequentist Continuum
- Confidence intervals, significance values, maximum likelihood estimates, etc. sharpened into Occam’s razors
- Confidence distributions and empirical Bayes posterior distributions unified as distributions of evidential support
- A Gaussian alternative to using improper confidence intervals
- The Geometry of Uncertainty
- A note on fiducial model averaging as an alternative to checking Bayesian and frequentist models
- Calibration of \(p\) Values for Testing Precise Null Hypotheses
- Elements of Information Theory
- Fiducial Generalized Confidence Intervals
- Moving to a World Beyond "\(p < 0.05\)"
- The ASA Statement on p-Values: Context, Process, and Purpose
- Maximum entropy derived and generalized under idempotent probability to address Bayes-frequentist uncertainty and model revision uncertainty: an information-theoretic semantics for possibility theory
- Likelihood, Replicability and Robbins' Confidence Sequences