Conformal mirror descent with logarithmic divergences
From MaRDI portal
Publication: 6138802
DOI: 10.1007/s41884-022-00089-3
arXiv: 2209.02938
MaRDI QID: Q6138802
Ting-Kam Leonard Wong, Amanjit Singh Kainth, Frank Rudzicz
Publication date: 16 January 2024
Published in: Information Geometry
Full work available at URL: https://arxiv.org/abs/2209.02938
Keywords: logarithmic divergence, conformal Hessian metric, conformal mirror descent, Dirichlet optimal transport, Hessian gradient flow
Cites Work
- The geometry of relative arbitrage
- Isometric logratio transformations for compositional data analysis
- Information geometry and its applications
- Continuity, curvature, and the general covariance of optimal transportation
- Stability of a 4th-order curvature condition arising in optimal transport theory
- Geometry of minimum contrast
- Gradient systems in view of information geometry
- Logarithmic divergences from optimal transport and Rényi geometry
- Exponentially concave functions and a new information geometry
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Cramér-Rao lower bounds arising from generalized Csiszár divergences
- Projection theorems and estimating equations for power-law models
- Pseudo-Riemannian geometry encodes information geometry in optimal transport
- Multiplicative Schrödinger problem and the Dirichlet transport
- A gradient descent perspective on Sinkhorn
- Logarithmic divergences: geometry and interpretation of curvature
- A Regression Model for Compositional Data Based on the Shifted-Dirichlet Distribution
- On Conformal Divergences and Their Population Minimizers
- Rényi Divergence and Kullback-Leibler Divergence
- Generalised Thermostatistics
- Polar factorization and monotone rearrangement of vector-valued functions
- A variational perspective on accelerated methods in optimization
- Hessian Riemannian Gradient Flows in Convex Programming
- Information Geometry of U-Boost and Bregman Divergence
- Information Geometry in Portfolio Theory
- Tsallis and Rényi Deformations Linked via a New λ-Duality
- A Relationship Between Arbitrary Positive Matrices and Doubly Stochastic Matrices
- Information Geometry
- Minimum Divergence Methods in Statistical Machine Learning
- The Information Geometry of Mirror Descent
- Information geometry
- Logistic regression, AdaBoost and Bregman distances