Pages that link to "Item:Q5157210"
From MaRDI portal
The following pages link to Bias Reduction and Metric Learning for Nearest-Neighbor Estimation of Kullback-Leibler Divergence (Q5157210):
Displaying 10 items.
- Free energy computations by minimization of Kullback-Leibler divergence: An efficient adaptive biasing potential method for sparse representations (Q417933) (← links)
- Bias reduction via linear combination of nearest neighbour entropy estimators (Q622770) (← links)
- Asymptotics for function derivatives estimators based on stationary and ergodic discrete time processes (Q2086282) (← links)
- Model parameter learning using Kullback-Leibler divergence (Q2148645) (← links)
- Uniform convergence rate of the kernel regression estimator adaptive to intrinsic dimension in presence of censored data (Q4988815) (← links)
- Uniform almost sure convergence and asymptotic distribution of the wavelet-based estimators of partial derivatives of multivariate density function under weak dependence (Q5012342) (← links)
- Some results about kernel estimators for function derivatives based on stationary and ergodic continuous time processes with applications (Q5079799) (← links)
- Direct Density Derivative Estimation (Q5380441) (← links)
- Nonparametric Estimation of Kullback-Leibler Divergence (Q5383802) (← links)
- Nonparametric recursive estimation for multivariate derivative functions by stochastic approximation method (Q6133738) (← links)