Chernoff distance for conditionally specified models
DOI: 10.1007/s00362-016-0804-5
zbMath: 1397.62384
OpenAlex: W2463832448
MaRDI QID: Q1785816
Publication date: 1 October 2018
Published in: Statistical Papers
Full work available at URL: https://doi.org/10.1007/s00362-016-0804-5
Keywords: likelihood ratio order; weighted model; Chernoff distance; conditionally specified model; conditional proportional (reversed) hazard rate model
MSC classifications:
- Characterization and structure theory for multivariate probability distributions; copulas (62H05)
- Statistical aspects of information-theoretic topics (62B10)
- Reliability and life testing (62N05)
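For orientation (a standard definition, not an excerpt from the paper, whose conditionally specified variant may differ): the Chernoff distance of order \(\alpha\) between two densities \(f\) and \(g\) is

\[
C_{\alpha}(f, g) = -\log \int f^{\alpha}(x)\, g^{1-\alpha}(x)\, \mathrm{d}x, \qquad 0 < \alpha < 1,
\]

and the Chernoff information is obtained by maximizing \(C_{\alpha}\) over \(\alpha \in (0,1)\).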
Related Items (2)
- On generalized conditional cumulative past inaccuracy measure
- On generalized conditional cumulative residual inaccuracy measure
Cites Work
- Characterizations of bivariate models using dynamic Kullback-Leibler discrimination measures
- Chernoff distance for truncated distributions
- Multivariate dynamic information
- Some new classes of multivariate survival distribution functions
- A vector multivariate hazard rate
- Conditional specification of statistical models
- Bivariate generalized cumulative residual entropy
- On dynamic mutual information for bivariate lifetimes
- Dynamic cumulative residual Renyi's entropy
- Bivariate Logistic Distributions
- Proportional Hazards Model for Multivariate Failure Time Data
- On the convexity of some divergence measures based on entropy functions
- On multivariate weighted distributions
- Modeling failure time data by Lehman alternatives
- Hellinger distances and α-entropy in a one-parameter class of density functions
- A characterization of model approach for generating bivariate life distributions using reversed hazard rates
- Bivariate extension of (dynamic) cumulative past entropy
- Characterizations of Bivariate Models Using Some Dynamic Conditional Information Divergence Measures
- On Information and Sufficiency
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations