A generalized divergence for statistical inference
From MaRDI portal
Publication: 2405123
DOI: 10.3150/16-BEJ826
zbMath: 1387.62041
MaRDI QID: Q2405123
Ayanendranath Basu, Ian R. Harris, Abhik Ghosh, Avijit Maji, Leandro Pardo
Publication date: 21 September 2017
Published in: Bernoulli
Full work available at URL: https://projecteuclid.org/euclid.bj/1494316831
Mathematics Subject Classification:
Asymptotic properties of parametric estimators (62F12)
Robustness and adaptive procedures (parametric inference) (62F35)
Related Items (14)
The minimum \(S\)-divergence estimator under continuous models: the Basu-Lindsay approach
Robust Wald-type test statistics based on minimum C-divergence estimators
The logarithmic super divergence and asymptotic inference properties
Testing linear hypotheses in logistic regression analysis with complex sample survey data based on phi-divergence measures
The extended Bregman divergence and parametric estimation
Adaptation of the tuning parameter in general Bayesian inference with robust divergence
Towards a better understanding of the dual representation of phi divergences
Improvements in the small sample efficiency of the minimum S-divergence estimators under discrete models
Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence
Asymptotic properties of minimum \(S\)-divergence estimator for discrete models
Minimum phi-divergence estimators for multinomial logistic regression with complex sample design
Robust statistical inference based on the \(C\)-divergence family
On the robustness of a divergence based test of simple statistical hypotheses
Influence function analysis of the restricted minimum divergence estimators: a general form