Kullback-Leibler divergence based multidimensional robust universal hypothesis testing
Publication: 6657830
DOI: 10.1007/s11222-024-10533-2
MaRDI QID: Q6657830
Publication date: 7 January 2025
Published in: Statistics and Computing
MSC classification:
- Computational methods for problems pertaining to statistics (62-08)
- Nonparametric hypothesis testing (62G10)
- Hypothesis testing in multivariate analysis (62H15)
Cites Work
- Conic optimization via operator splitting and homogeneous self-dual embedding
- Distances on probability measures and random variables
- Generalized Cramér-von Mises goodness-of-fit tests for multivariate distributions
- Product of \(n\) independent uniform random variables
- Goodness-of-fit testing for copulas: a distribution-free approach
- CVXPY: a Python-embedded modeling language for convex optimization
- Asymptotically Optimal One- and Two-Sample Testing With Kernels
- Robust Kullback-Leibler Divergence and Universal Hypothesis Testing for Continuous Distributions
- Distances of Probability Measures and Random Variables
- Calcul des probabilités.
- New bounds for the empirical robust Kullback-Leibler divergence problem