A test for independence via Bayesian nonparametric estimation of mutual information
Publication: 6059410
DOI: 10.1002/cjs.11645
arXiv: 2002.03490
OpenAlex: W3193974515
MaRDI QID: Q6059410
Zahra Saberi, Luai Al-Labadi, Forough Fazeli Asl
Publication date: 2 November 2023
Published in: Canadian Journal of Statistics
Full work available at URL: https://arxiv.org/abs/2002.03490
Keywords: mutual information; Dirichlet process; test for independence; relative belief inferences; \(k\)-nearest neighbour distance
Cites Work
- Scalable Bayesian nonparametric measures for exploring pairwise dependence via Dirichlet process mixtures
- On convergence properties of Shannon entropy
- A smoothed bootstrap test for independence based on mutual information
- Sample estimate of the entropy of a random vector
- Two measures of sample entropy
- Goodness of fit for the logistic regression model using relative belief
- A Bayesian nonparametric approach to testing for dependence between random variables
- Efficient multivariate entropy estimation via \(k\)-nearest neighbour distances
- A new measure of entropy of continuous random variable
- A Bayesian analysis of some nonparametric problems
- On the entropy estimators
- Exact and approximate sum representations for the Dirichlet process
- Nonparametric goodness-of-fit
- Prior‐based model checking
- Some new results on information properties of mixture distributions
- On one-sample Bayesian tests for the mean
- Goodness-of-fit tests based on the distance between the Dirichlet process and its base measure
- Elements of Information Theory
- Nonparametric independence testing via mutual information