Pages that link to "Item:Q5383802"
From MaRDI portal
The following pages link to Nonparametric Estimation of Kullback-Leibler Divergence (Q5383802):
Displaying 17 items.
- Interpreting Kullback–Leibler divergence with the Neyman-Pearson Lemma (Q855917) (← links)
- Performance study of marginal posterior density estimation via Kullback-Leibler divergence (Q1382945) (← links)
- Non-parametric estimation of Kullback-Leibler discrimination information based on censored data (Q2273700) (← links)
- Limit theorems for empirical Rényi entropy and divergence with applications to molecular diversity analysis (Q2397985) (← links)
- A path sampling identity for computing the Kullback-Leibler and J divergences (Q2445627) (← links)
- Alternatives to maximum likelihood estimation based on spacings and the Kullback-Leibler divergence (Q2480029) (← links)
- Estimation of Kullback-Leibler divergence by local likelihood (Q2502140) (← links)
- The Kullback–Leibler Divergence Rate Between Markov Sources (Q3547530) (← links)
- Estimation of KL Divergence: Optimal Minimax Rate (Q4569211) (← links)
- Empirically Estimable Classification Bounds Based on a Nonparametric Divergence Measure (Q4618147) (← links)
- Quantile-based cumulative Kullback–Leibler divergence (Q4639145) (← links)
- Applications of a Kullback-Leibler divergence for comparing non-nested models (Q4970827) (← links)
- Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence (Q5157210) (← links)
- A Characterization of All Single-Integral, Non-Kernel Divergence Estimators (Q5211552) (← links)
- Characterizing variation of nonparametric random probability measures using the Kullback–Leibler divergence (Q5283160) (← links)
- Non parametric estimation of the measure associated with the Lévy–Khintchine canonical representation (Q5860768) (← links)
- Optimal non-asymptotic concentration of centered empirical relative entropy in the high-dimensional regime (Q6165358) (← links)