New bounds for the empirical robust Kullback-Leibler divergence problem
DOI: 10.1016/j.ins.2023.118972
OpenAlex: W4366503330
MaRDI QID: Q6124700
Publication date: 28 March 2024
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/j.ins.2023.118972
MSC classification:
- Nonparametric hypothesis testing (62G10)
- Convex programming (90C25)
- Statistical aspects of information-theoretic topics (62B10)
Cites Work
- Unnamed Item
- Conic optimization via operator splitting and homogeneous self-dual embedding
- A test for population collinearity. A Kullback-Leibler information approach
- Randomized mixture models for probability density approximation and estimation
- On the tight constant in the multivariate Dvoretzky-Kiefer-Wolfowitz inequality
- CVXPY: A Python-Embedded Modeling Language for Convex Optimization
- Joint state and parameter robust estimation of stochastic nonlinear systems
- On universal hypotheses testing via large deviations
- Random coding strategies for minimum entropy
- On Choosing and Bounding Probability Metrics
- Asymptotically Optimal One- and Two-Sample Testing With Kernels
- Robust Kullback-Leibler Divergence and Universal Hypothesis Testing for Continuous Distributions
- Asymptotically Optimal Tests for Multinomial Distributions