Entropy factor for randomness quantification in neuronal data
From MaRDI portal
Publication: 2179072
DOI: 10.1016/j.neunet.2017.07.016 · zbMath: 1439.92019 · OpenAlex: W2749209900 · Wikidata: Q48009696 · Scholia: Q48009696 · MaRDI QID: Q2179072
Lubomir Kostal, Kamil Rajdl, Petr Lansky
Publication date: 12 May 2020
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2017.07.016
Neural networks for/in biological studies, artificial life and related topics (92B20) · Measures of information, entropy (94A17)
Related Items (2)
- The Jacobi diffusion process as a neuronal model
- On two diffusion neuronal models with multiplicative noise: the mean first-passage time properties
Cites Work
- The gamma renewal process as an output of the diffusion leaky integrate-and-fire neuronal model
- The effect of interspike interval statistics on the information gain under the rate coding hypothesis
- Fano factor estimation
- An introductory review of information theory in the context of computational neuroscience
- Statistical structure of neural spiking under non-Poissonian or other non-white stimulation
- Similarity of interspike interval distributions and information gain in a stationary neuronal firing
- Measures of statistical dispersion based on Shannon and Fisher information concepts
- Optimizing Time Histograms for Non-Poissonian Spike Trains
- Bias analysis in entropy estimation
- Estimating Instantaneous Irregularity of Neuronal Firing
- Spiking Neuron Models
- Firing Variability Is Higher than Deduced from the Empirical Coefficient of Variation
- Impact of Spike Train Autostructure on Probability Distribution of Joint Spike Events
- Elements of Information Theory
- The Properties of Recurrent-Event Processes