Moderating probability distributions for unrepresented uncertainty: Application to sentiment analysis via deep learning
From MaRDI portal
Publication: 5867742
DOI: 10.1080/03610926.2020.1863988
OpenAlex: W3119902693
MaRDI QID: Q5867742
Publication date: 14 September 2022
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2020.1863988
Keywords: maximum entropy; big data; data science; deep learning; deep neural network; discounting probability distributions; unknown loss function
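The keywords point to discounting a deep network's output probabilities toward the maximum-entropy (uniform) distribution, so that uncertainty the model cannot represent is not reported as overconfidence. A minimal sketch of one such moderation scheme follows; the linear mixing form and the `alpha` parameter are illustrative assumptions, not necessarily the paper's method:

```python
import math

def moderate(probs, alpha):
    """Discount a probability vector toward the uniform (maximum-entropy)
    distribution: p' = (1 - alpha) * p + alpha * u, with u uniform.
    alpha in [0, 1] controls how much unrepresented uncertainty is added.
    (Illustrative scheme; the paper's actual moderation may differ.)"""
    uniform = 1.0 / len(probs)
    return [(1.0 - alpha) * p + alpha * uniform for p in probs]

def entropy(probs):
    """Shannon entropy in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical overconfident softmax output from a sentiment classifier
# over the classes (positive, neutral, negative).
p = [0.96, 0.03, 0.01]
q = moderate(p, alpha=0.3)

print(q)                        # moderated distribution, still sums to 1
print(entropy(q) > entropy(p))  # moderation strictly raises entropy
```

With `alpha = 0` the classifier's distribution is returned unchanged, and with `alpha = 1` it collapses to the uniform distribution; intermediate values trade off the model's reported confidence against the unrepresented uncertainty.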
Uses Software
Cites Work
- Axiomatic characterizations of information measures
- Inference after checking multiple Bayesian models for data conflict and applications to mitigating the influence of rejected priors
- Decision making under uncertainty using imprecise probabilities
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
- Departing from Bayesian inference toward minimaxity to the extent that the posterior distribution is unreliable
- Size, power and false discovery rates
- Probability Theory
- Confidence distributions applied to propagating uncertainty to inference based on estimating the local false discovery rate: A fiducial continuum from confidence sets to empirical Bayes set estimates as the number of comparisons increases
- Reporting Bayes factors or probabilities to decision makers of unknown loss functions
- Confidence distributions and empirical Bayes posterior distributions unified as distributions of evidential support
- Introduction to Imprecise Probabilities