Estimating Entropy Rates with Bayesian Confidence Intervals
Publication: 3025070
DOI: 10.1162/0899766053723050
zbMath: 1064.62005
OpenAlex: W2102985846
Wikidata: Q46052003 (Scholia: Q46052003)
MaRDI QID: Q3025070
Jonathon Shlens, Henry D. I. Abarbanel, E. J. Chichilnisky, Matthew B. Kennel
Publication date: 4 July 2005
Published in: Neural Computation
Full work available at URL: https://escholarship.org/uc/item/9243v6dr
Mathematics Subject Classification:
- Parametric tolerance and confidence regions (62F25)
- Bayesian inference (62F15)
- Monte Carlo methods (65C05)
- Neural biology (92C20)
- Statistical aspects of information-theoretic topics (62B10)
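The subject codes above (Bayesian inference, Monte Carlo methods, entropy estimation) can be illustrated with a minimal sketch of Bayesian entropy estimation for a discrete distribution: place a symmetric Dirichlet prior on the probability vector, draw posterior samples by Monte Carlo, and read off a credible interval for the entropy. This is a generic illustration of the technique class, not the entropy-rate estimator developed in the paper itself; the function name and prior choice are assumptions made for this sketch.

```python
import numpy as np

def bayes_entropy_interval(counts, alpha=1.0, n_samples=5000, level=0.95, seed=0):
    """Monte Carlo credible interval for the entropy (in bits) of a discrete
    distribution, under a symmetric Dirichlet(alpha) prior on the probabilities.
    NOTE: a generic illustrative estimator, not the method from the paper."""
    rng = np.random.default_rng(seed)
    counts = np.asarray(counts, dtype=float)
    # With a Dirichlet(alpha) prior and multinomial counts, the posterior over
    # the probability vector is Dirichlet(counts + alpha).
    post = rng.dirichlet(counts + alpha, size=n_samples)
    # Entropy of each posterior sample; Dirichlet draws are strictly positive
    # almost surely, so the log is well defined.
    h = -np.sum(post * np.log2(post), axis=1)
    lo, hi = np.quantile(h, [(1.0 - level) / 2.0, (1.0 + level) / 2.0])
    return h.mean(), (lo, hi)

# Example: counts over a 4-symbol alphabet.
mean_h, (lo, hi) = bayes_entropy_interval([40, 30, 20, 10])
```

For a 4-symbol alphabet the entropy is bounded by log2(4) = 2 bits, so the interval endpoints must fall in [0, 2]; the paper's contribution concerns the harder problem of entropy *rates* of dependent sequences, where block probabilities and context models (cf. the context-tree weighting citations below) replace the simple multinomial counts used here.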
Related Items
- Coincidences and estimation of entropies of random variables with large cardinalities
- Synergy, redundancy, and multivariate information measures: an experimentalist's perspective
- Variance estimators for the Lempel-Ziv entropy rate estimator
- Indices for Testing Neural Codes
- A Locally Optimal Algorithm for Estimating a Generating Partition from an Observed Time Series and Its Application to Anomaly Detection
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- On the permutation entropy Bayesian estimation
- A Finite-Sample, Distribution-Free, Probabilistic Lower Bound on Mutual Information
- The permutation entropy rate equals the metric entropy rate for ergodic information sources and ergodic dynamical systems
- Predicting the synaptic information efficacy in cortical layer 5 pyramidal neurons using a minimal integrate-and-fire model
- Information in the Nonstationary Case
- Model-Based Decoding, Information Estimation, and Change-Point Detection Techniques for Multineuron Spike Trains
- Estimating Information Rates with Confidence Intervals in Neural Spike Trains
- Bayesian and quasi-Bayesian estimators for mutual information from discrete data
- Optimal instruments and models for noisy chaos
- Information processing in the LGN: a comparison of neural codes and cell types
Cites Work
- A Mathematical Theory of Communication
- Estimating the errors on measured entropy and mutual information
- Predictability, Complexity, and Learning
- Statistical Inference, Occam's Razor, and Statistical Mechanics on the Space of Probability Distributions
- Linear Time Universal Coding and Time Reversal of Tree Sources Via FSM Closure
- Universal Compression of Memoryless Sources Over Unknown Alphabets
- The performance of universal encoding
- On the Complexity of Finite Sequences
- A universal algorithm for sequential data compression
- Compression of individual sequences via variable-rate coding
- Metric-space analysis of spike trains: theory, algorithms and application
- The context-tree weighting method: extensions
- Nonparametric entropy estimation for stationary processes and random fields, with applications to English text
- Entropy estimation of symbol sequences
- On the role of pattern matching in information theory
- Estimation of Entropy and Mutual Information
- Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity
- The context-tree weighting method: basic properties
- Geodesic Entropic Graphs for Dimension and Entropy Estimation in Manifold Learning
- Monte Carlo sampling methods using Markov chains and their applications
- A formal theory of inductive inference. Part I