Minimum \(\phi\)-divergence estimators with constraints in multinomial populations
From MaRDI portal
Publication:1600754
DOI: 10.1016/S0378-3758(01)00113-6
zbMath: 0988.62014
OpenAlex: W2065405197
MaRDI QID: Q1600754
Leandro Pardo, Julio Angel Pardo, Konstantinos G. Zografos
Publication date: 16 June 2002
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/s0378-3758(01)00113-6
Keywords: multinomial distribution; power divergence; noncentrality parameters; minimum phi-divergence estimators with constraints
Mathematics Subject Classification:
- Asymptotic distribution theory in statistics (62E20)
- Point estimation (62F10)
- Parametric inference under constraints (62F30)
- Statistical aspects of information-theoretic topics (62B10)
Related Items
- Statistical inference for multinomial populations based on a double index family of test statistics
- Phi-divergences and polytomous logistic regression models: An overview
- On tests of symmetry, marginal homogeneity and quasi-symmetry in two-way contingency tables based on minimum \(\phi\)-divergence estimator with constraints
- Poisson loglinear modeling with linear constraints on the expected cell frequencies
- An approach to multiway contingency tables based on \(\phi\)-divergence test statistics
- Analysis of \(\phi\)-divergence for loglinear models with constraints under product-multinomial sampling
- Analysis of divergence in loglinear models when expected frequencies are subject to linear constraints
- Conditional tests of marginal homogeneity based on \(\phi\)-divergence test statistics
- Minimum phi-divergence estimators for loglinear models with linear constraints and multinomial sampling
- On tests of homogeneity based on minimum \(\varphi\)-divergence estimator with constraints
- Informative barycentres in statistics
- Phi-divergence statistics for testing linear hypotheses in logistic regression models
- On tests of independence based on minimum \(\varphi\)-divergence estimator with constraints: An application to modeling DNA
- An extension of likelihood-ratio-test for testing linear hypotheses in the baseline-category logit model
Cites Work
- Maximum likelihood methods for linear and log-linear models in categorical data
- Goodness-of-fit statistics for discrete multivariate data
- Minimum Hellinger distance estimates for parametric models
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- Asymptotic divergence of estimates of discrete distributions
- Divergence statistics: sampling properties and multinomial goodness of fit and divergence tests
- The Lagrangian Multiplier Test
- Maximum-Likelihood Estimation of Parameters Subject to Restraints
- Maximum Likelihood Methods for Log-Linear Models When Expected Frequencies are Subjected to Linear Constraints
- A New Proof of the Pearson-Fisher Theorem