Estimation, prediction and the Stein phenomenon under divergence loss
Publication: 953855
DOI: 10.1016/j.jmva.2008.02.002
zbMath: 1274.62080
OpenAlex: W1963870319
MaRDI QID: Q953855
Malay Ghosh, Gauri Sankar Datta, Victor Mergel
Publication date: 6 November 2008
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2008.02.002
Keywords: empirical Bayes; minimaxity; admissibility; Kullback-Leibler loss; Baranchik class; Bhattacharyya-Hellinger loss
Mathematics Subject Classification:
Minimax procedures in statistical decision theory (62C20)
Empirical decision procedures; empirical Bayes procedures (62C12)
Admissibility in statistical decision theory (62C15)
Related Items
- On minimax optimality of sparse Bayes predictive density estimates
- Reference priors via \(\alpha\)-divergence for a certain non-regular model in the presence of a nuisance parameter
- From minimax shrinkage estimation to minimax shrinkage prediction
- On generalized moment identity and its applications: a unified approach
- Matrix variate density estimation with additional information
- Some Variants of Constrained Estimation in Finite Population Sampling
- Optimal shrinkage estimation of predictive densities under \(\alpha\)-divergences
- Density prediction and the Stein phenomenon
- On predictive density estimation with additional information
- On discrete priors and sparse minimax optimal predictive densities
- Minimax Estimation of the Mean Matrix of the Matrix Variate Normal Distribution under the Divergence Loss Function
- Hierarchical empirical Bayes estimation of two sample means under divergence loss
- On the Stein phenomenon under divergence loss and an unknown variance-covariance matrix
- On predictive density estimation under \(\alpha\)-divergence loss
- On the Loss Robustness of Least-Square Estimators
- Exact minimax estimation of the predictive density in sparse Gaussian models
Cites Work
- Unnamed Item
- Deriving posterior distributions for a location parameter: A decision theoretic approach
- Minimax estimation of location parameters for spherically symmetric distributions with concave loss
- A statistical diptych: Admissible inferences -- recurrence of symmetric Markov chains
- Minimax Bayes estimators of a multivariate normal mean
- Differential geometry of curved exponential families. Curvatures and information loss
- Admissibility of procedures in two-dimensional location parameter problems
- Inadmissibility of maximum likelihood estimators in some multiple regression problems with three or more independent variables
- Improved minimax predictive densities under Kullback-Leibler loss
- A shrinkage predictive distribution for multivariate Normal observables
- The Admissibility of Pitman's Estimator of a Single Location Parameter
- Goodness of prediction fit
- A Generalized Bayes Rule for Prediction
- A Family of Minimax Estimators of the Mean of a Multivariate Normal Distribution
- Stein's Estimation Rule and Its Competitors--An Empirical Bayes Approach
- On the Admissibility of Invariant Estimators of One or More Location Parameters
- Proper Bayes Minimax Estimators of the Multivariate Normal Mean
- Admissible Estimators, Recurrent Diffusions, and Insoluble Boundary Value Problems
- Some Problems in Minimax Point Estimation
- On Minimax Statistical Decision Procedures and their Admissibility