A convenient infinite dimensional framework for generative adversarial learning
From MaRDI portal
Publication: 2683193
DOI: 10.1214/23-EJS2104
MaRDI QID: Q2683193
Matthias Rottmann, Hayk Asatryan, Hanno Gottschalk, Marieke Lippert
Publication date: 3 February 2023
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/2011.12087
Keywords: statistical learning theory; chaining; covering numbers for Hölder spaces; generative adversarial learning; inverse Rosenblatt transformation
MSC classification: Asymptotic properties of nonparametric inference (62G20); Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Partial differential equations. I: Basic theory
- Elliptic partial differential equations of second order
- Weak convergence and empirical processes. With applications to statistics
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Some theoretical properties of GANs
- Error bounds for approximations with deep ReLU networks
- Dependence Modeling with Copulas
- Mathematical Foundations of Infinite-Dimensional Statistical Models
- Polar factorization and monotone rearrangement of vector‐valued functions
- Equivalence of approximation by convolutional neural networks and fully-connected networks
- Advanced Lectures on Machine Learning
- Understanding Machine Learning
- Monte Carlo sampling methods using Markov chains and their applications
- Probability-1
- Remarks on a Multivariate Transformation