Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence
From MaRDI portal
Publication:4967759
DOI: 10.1007/978-3-030-02520-5_8
zbMath: 1420.62016
OpenAlex: W2900826962
MaRDI QID: Q4967759
Michel Broniatowski, Wolfgang Stummer
Publication date: 10 July 2019
Published in: Geometric Structures of Information
Full work available at URL: https://doi.org/10.1007/978-3-030-02520-5_8
Classification (MSC):
- Point estimation (62F10)
- Robustness and adaptive procedures (parametric inference) (62F35)
- Learning and adaptive systems in artificial intelligence (68T05)
- Statistical aspects of information-theoretic topics (62B10)
Related Items (4)
- On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
- On a cornerstone of bare-simulation distance/divergence optimization
- Aggregated tests based on supremal divergence estimators for non-regular statistical models
- Optimal transport with some directed distances
Uses Software
Cites Work
- Computational Optimal Transport: With Applications to Data Science
- Robust Bayes estimation using the density power divergence
- On local divergences between two probability measures
- Geometric science of information. Second international conference, GSI 2015, Palaiseau, France, October 28--30, 2015. Proceedings
- The geometry of relative arbitrage
- Geometric science of information. First international conference, GSI 2013, Paris, France, August 28--30, 2013. Proceedings
- The power divergence and the density power divergence families: the mathematical connection
- Decomposable pseudodistances and applications in statistical estimation
- Dual divergence estimators and tests: robustness results
- On testing local hypotheses via local divergence
- Information geometry and its applications
- Statistical decision theory. Estimation, testing, and selection.
- Parametric estimation and tests through divergences and the duality technique
- Goodness-of-fit statistics for discrete multivariate data
- Theory of statistical inference and information. Transl. from the Slovak by the author
- Geometric science of information. Third international conference, GSI 2017, Paris, France, November 7--9, 2017. Proceedings
- Information geometry of Wasserstein divergence
- Information geometry under monotone embedding. I: Divergence functions
- 3D insights to some divergences for robust statistics and machine learning
- Some new flexibilizations of Bregman divergences and their asymptotics
- Bregman divergences from comparative convexity
- \(k\)-means clustering with Hölder divergences
- Information geometry connecting Wasserstein distance and Kullback-Leibler divergence via the entropy-relaxed transportation problem
- Exponentially concave functions and a new information geometry
- Minimum disparity estimators for discrete and continuous models.
- Towards a better understanding of the dual representation of phi divergences
- Minimum disparity estimation for continuous models: Efficiency, distributions and robustness
- Weak convergence and empirical processes. With applications to statistics
- Density-ratio matching under the Bregman divergence: a unified framework of density-ratio estimation
- A general set up for minimum disparity estimation
- Robust tests for the equality of two normal means based on the density power divergence
- A generalized divergence for statistical inference
- Robust estimation in generalized linear models: the density power divergence approach
- Some Results on the Curse of Dimensionality and Sample Size Recommendations
- New Model Search for Nonlinear Recursive Models, Regressions and Autoregressions
- Matrix Information Geometry
- Several Applications of Divergence Criteria in Continuous Families
- Robust Statistical Engineering by Means of Scaled Bregman Distances
- On Conformal Divergences and Their Population Minimizers
- On Efficient Estimation in Continuous Models Based on Finitely Quantized Observations
- Minimization of φ-divergences on sets of signed measures
- On Divergences and Informations in Statistics and Information Theory
- Robust and efficient estimation by minimising a density power divergence
- Statistical information and discrimination
- Robust Estimation for Grouped Data
- Two approaches to grouping of data and related disparity statistics
- A new toolkit for robust distributional change detection
- Information Geometry of U-Boost and Bregman Divergence
- Multivariate Density Estimation
- On Bregman Distances and Divergences of Probability Measures
- Information Geometry
- Information, Divergence and Risk for Binary Experiments
- On divergences of finite measures and their applicability in statistics and information theory
- Some Decision Procedures Based on Scaled Bregman Distance Surfaces
- Prediction, Learning, and Games
- Robust and efficient estimation under data grouping
- Uncertainty, Information, and Sequential Experiments
- Statistical Inference
- Minimum divergence estimators based on grouped data
- Logistic regression, AdaBoost and Bregman distances
- Bregman Voronoi diagrams