Density estimation with minimization of \(U\)-divergence
From MaRDI portal
DOI: 10.1007/s10994-012-5298-3 · zbMath: 1260.68340 · OpenAlex: W2033320332 · MaRDI QID: Q1945015
Publication date: 28 March 2013
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-012-5298-3
MSC classifications: Estimation in multivariate analysis (62H12) · Learning and adaptive systems in artificial intelligence (68T05)
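The classification codes place this work in multivariate kernel density estimation. As general background for the topic (this is a plain Gaussian kernel density estimator, a minimal illustrative sketch — not the paper's \(U\)-divergence minimization procedure), a one-dimensional KDE can be written as:

```python
import math

def gaussian_kde(data, h):
    """Return a kernel density estimate f_hat built from `data` with
    bandwidth h > 0, using the Gaussian kernel. Illustrative only;
    bandwidth selection is the subject of several of the cited works."""
    n = len(data)
    norm = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    def f_hat(x):
        # Average of Gaussian bumps centered at each observation.
        return norm * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data)
    return f_hat

# Example: estimate a density from three observations.
f = gaussian_kde([0.0, 1.0, 2.0], h=0.5)
```

The estimate integrates to one and is largest near the data; the cited bandwidth-selection papers address how to choose `h` in practice.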
Related Items (2)
- Kernel density estimation by genetic algorithm
- Regression with stagewise minimization on risk function
Uses Software
Cites Work
- Empirical risk minimization in inverse problems
- Robust parameter estimation with a small bias against heavy contamination
- Selective sampling using the query by committee algorithm
- Convergence rates for unconstrained bandwidth matrix selectors in multivariate kernel density estimation
- Smoothing methods in statistics
- Density estimation with stagewise optimization of the empirical risk
- A Brief Survey of Bandwidth Selection for Density Estimation
- Robust Blind Source Separation by Beta Divergence
- Information Divergence Geometry and the Application to Statistical Machine Learning
- Robust and efficient estimation by minimising a density power divergence
- Boosting with the \(L_2\) loss
- Plug-in bandwidth matrices for bivariate kernel density estimation
- Information Geometry of U-Boost and Bregman Divergence
- Comparison of Smoothing Parameterizations in Bivariate Kernel Density Estimation
- Boosting kernel density estimates: A bias reduction technique?
- Cross-validation Bandwidth Matrices for Multivariate Kernel Density Estimation
- Looking for lumps: boosting and bagging for density estimation.