A nonparametric two‐sample test using a general φ‐divergence‐based mutual information
DOI: 10.1111/stan.12232
OpenAlex: W3112665147
MaRDI QID: Q6067721
Atanu Biswas, Abhik Ghosh, Apratim Guha
Publication date: 14 December 2023
Published in: Statistica Neerlandica
Full work available at URL: https://doi.org/10.1111/stan.12232
Keywords: kernel density estimation, power divergence, information leakage, nonparametric test, test of independence, \(\varphi\)-divergence
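The record carries only metadata, but the title and keywords outline the technique: a nonparametric two-sample test built from a φ-divergence-based mutual information between the pooled observations and their sample labels, estimated via kernel density estimation and calibrated nonparametrically. As an illustration only (a generic sketch, not the authors' procedure: the functions `phi_mi_statistic` and `permutation_test`, the KL-type default choice of `phi`, and the permutation calibration are all assumptions made for this example), such a test might look like:

```python
import numpy as np
from scipy.stats import gaussian_kde


def phi_mi_statistic(x, y, phi=lambda t: t * np.log(t)):
    """Plug-in estimate of a phi-divergence-based mutual information
    between the pooled observations and their sample labels.

    With phi(t) = t*log(t) this estimates the Shannon mutual information
    I(X; Z), where Z is the sample indicator; it is zero exactly when the
    two samples share a common distribution.
    """
    pooled = np.concatenate([x, y])
    f_pool = gaussian_kde(pooled)          # KDE of the mixture density
    f_x, f_y = gaussian_kde(x), gaussian_kde(y)
    pi_x = len(x) / len(pooled)            # mixing weights P(Z=1), P(Z=2)
    pi_y = 1.0 - pi_x
    eps = 1e-12                            # guard against division by zero
    base = np.maximum(f_pool(pooled), eps)
    r_x = np.maximum(f_x(pooled) / base, eps)
    r_y = np.maximum(f_y(pooled) / base, eps)
    # MI_phi = pi_x * D_phi(f_x || f_pool) + pi_y * D_phi(f_y || f_pool),
    # with each D_phi = E_pool[phi(f_z / f_pool)] approximated by the
    # sample average over the pooled observations (which follow f_pool).
    return float(np.mean(pi_x * phi(r_x) + pi_y * phi(r_y)))


def permutation_test(x, y, n_perm=500, seed=0, **kwargs):
    """Calibrate the statistic with a permutation null (H0: same law)."""
    rng = np.random.default_rng(seed)
    observed = phi_mi_statistic(x, y, **kwargs)
    pooled, n = np.concatenate([x, y]), len(x)
    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)     # relabel the pool under H0
        exceed += phi_mi_statistic(perm[:n], perm[n:], **kwargs) >= observed
    return observed, (exceed + 1) / (n_perm + 1)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.normal(0.0, 1.0, 100)
    y = rng.normal(0.5, 1.0, 100)          # shifted mean: H0 is false
    stat, pval = permutation_test(x, y, n_perm=200)
    print(f"phi-MI statistic = {stat:.4f}, permutation p-value = {pval:.3f}")
```

Swapping the `phi` argument (e.g. the power-divergence family mentioned in the keywords) changes the divergence without changing the rest of the pipeline, which is the modularity the "general φ-divergence" framing suggests.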
Cites Work
- Assessing the dependence structure of the components of hybrid time series processes using mutual information
- Minimum Hellinger distance estimates for parametric models
- Empirical phi-divergence test statistics for the difference of means of two populations
- Empirical likelihood inference for two-sample problems
- Two-sample empirical likelihood method
- Empirical likelihood for the two-sample mean problem
- Measuring stochastic dependence using \(\phi\)-divergence
- Auto-association measures for stationary time series of categorical data
- Convergence properties of functional estimates for discrete distributions
- Does My Device Leak Information? An a priori Statistical Power Analysis of Leakage Detection Tests
- Empirical phi-divergence test statistics for testing simple and composite null hypotheses
- Statistical Measurement of Information Leakage
- Nonparametric Entropy-Based Tests of Independence Between Stochastic Processes
- A Nonparametric Test for the General Two-Sample Problem
- Likelihood divergence statistics for testing hypotheses about multiple population
- Estimation of Entropy and Mutual Information
- Testing Statistical Hypotheses
- Empirical likelihood tests for two-sample problems via nonparametric density estimation
- Elements of Information Theory
- Statistical Inference
- Robust Statistics