Pages that link to "Item:Q280206"
From MaRDI portal
The following pages link to Information optimality and Bayesian modelling (Q280206):
Displaying 16 items.
- Eliciting vague but proper maximal entropy priors in Bayesian experiments (Q451441) (← links)
- Information and the dispersion of posterior expectations (Q472227) (← links)
- Incorporating prior information when true priors are unknown: an information-theoretic approach for increasing efficiency in estimation (Q498812) (← links)
- A further note on Bayesian information topologies (Q1207014) (← links)
- Using the Bayesian Shtarkov solution for predictions (Q1658740) (← links)
- Asymptotically minimax Bayesian predictive densities for multinomial models (Q1950845) (← links)
- A remark on the maximum entropy principle in uncertainty theory (Q2100463) (← links)
- A frequentist framework of inductive reasoning (Q2392499) (← links)
- Role of information in classical and Bayesian modelling (Q2474694) (← links)
- Generalized information criteria for Bayes decisions (Q2919491) (← links)
- Information-theoretic asymptotics of Bayes methods (Q3492635) (← links)
- Eliciting prior information to enhance the predictive performance of Bayesian graphical models (Q4337066) (← links)
- Kullback-Leibler information approach to the optimum measurement point for Bayesian estimation (Q4337137) (← links)
- Bayesian or Laplacian inference, entropy and information theory and information geometry in data and signal processing (Q5066321) (← links)
- Statistical Problem Classes and Their Links to Information Theory (Q5080449) (← links)
- Interpreting uninterpretable predictors: kernel methods, Shtarkov solutions, and random forests (Q5880100) (← links)