scientific article; zbMATH DE number 7255170
From MaRDI portal
Publication:4969253
Publication date: 5 October 2020
Full work available at URL: https://arxiv.org/abs/1906.11148
Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.
neural networks; chaining mutual information; multilevel relative entropy; multiscale generalization bound; multiscale Gibbs distribution
Cites Work
- From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
- On prediction of individual sequences
- On the properties of variational approximations of Gibbs posteriors
- Rényi Divergence and Kullback-Leibler Divergence
- Probability and Stochastics
- Information-theoretic upper and lower bounds for statistical estimation
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- Learners that Use Little Information
- High-Dimensional Probability
- Fifty years of Shannon theory
- Upper and Lower Bounds for Stochastic Processes
- Sparse estimation by exponential weighting