A scaling law from discrete to continuous solutions of channel capacity problems in the low-noise limit
From MaRDI portal
Publication: 2315169
DOI: 10.1007/s10955-019-02296-2
zbMath: 1435.94110
arXiv: 1710.09351
OpenAlex: W2943353921
Wikidata: Q127966118 (Scholia: Q127966118)
MaRDI QID: Q2315169
Benjamin B. Machta, Michael C. Abbott
Publication date: 1 August 2019
Published in: Journal of Statistical Physics
Full work available at URL: https://arxiv.org/abs/1710.09351
Cites Work
- A Mathematical Theory of Communication
- Estimating a bounded normal mean
- Constrained minimax estimation of the mean of the normal distribution with known variance
- Jeffreys' prior is asymptotically least favorable under entropy risk
- Combining independent Bayesian posteriors into a confidence distribution, with application to estimating climate sensitivity
- On a Measure of the Information Provided by an Experiment
- On calculating the capacity of an infinite-input finite (infinite)-output channel
- A general minimax result for relative entropy
- Surprises in Numerical Expressions of Physical Constants
- Discrete Actions in Information-Constrained Decision Problems
- Uniform Approximation of Minimax Point Estimates
- The information capacity of amplitude- and variance-constrained scalar Gaussian channels
- An algorithm for computing the capacity of arbitrary discrete memoryless channels
- Computation of channel capacity and rate-distortion functions
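The last two cited works are Blahut's papers on computing channel capacity numerically. As a point of reference for the capacity problems the publication addresses, a minimal sketch of that fixed-point iteration (the Blahut–Arimoto algorithm) for a discrete memoryless channel might look like the following; the function name and parameters are illustrative, not taken from the cited works:

```python
import numpy as np

def blahut_arimoto(W, tol=1e-12, max_iter=10_000):
    """Capacity (in bits) of a discrete memoryless channel.

    W[x, y] = P(y | x); each row of W must sum to 1.
    Returns (capacity_bits, optimal_input_distribution).
    """
    nx = W.shape[0]
    r = np.full(nx, 1.0 / nx)                 # start from the uniform input
    # precompute log W with the 0 log 0 = 0 convention
    logW = np.log(W, out=np.zeros_like(W), where=W > 0)
    for _ in range(max_iter):
        p_y = r @ W                           # current output distribution
        with np.errstate(divide="ignore", invalid="ignore"):
            # d[x] = D( W(.|x) || p_y ), the per-symbol relative entropy
            d = np.sum(np.where(W > 0, W * (logW - np.log(p_y)), 0.0), axis=1)
        z = np.sum(r * np.exp(d))             # normaliser of the update
        r = r * np.exp(d) / z                 # multiplicative Blahut update
        # lower bound log z and upper bound max d pinch together at the optimum
        if np.max(d) - np.log(z) < tol:
            break
    return np.log(z) / np.log(2), r
```

For a binary symmetric channel with crossover probability 0.1, `blahut_arimoto(np.array([[0.9, 0.1], [0.1, 0.9]]))` recovers the textbook value C = 1 - H2(0.1) ≈ 0.531 bits with a uniform optimal input.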