Quantifying Stimulus Discriminability: A Comparison of Information Theory and Ideal Observer Analysis
Publication:5706662
DOI: 10.1162/0899766053429435
zbMath: 1173.94302
OpenAlex: W2120057264
Wikidata: Q36097310 (Scholia: Q36097310)
MaRDI QID: Q5706662
Eric E. Thomson, William B. Kristan Jr.
Publication date: 21 November 2005
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/0899766053429435
- Neural networks for/in biological studies, artificial life and related topics (92B20)
- Research exposition (monographs, survey articles) pertaining to information and communication theory (94-02)
- Measures of information, entropy (94A17)
Related Items
- A Causal Perspective on the Analysis of Signal and Noise Correlations and Their Role in Population Coding
- Indices for Testing Neural Codes
- Quantifying Neurotransmission Reliability Through Metrics-Based Information Analysis
- Pursuit of food versus pursuit of information in a Markovian perception-action loop model of foraging
- Coordinate invariance as a fundamental constraint on the form of stimulus-specific information measures
Cites Work
- Predictability, Complexity, and Learning
- Always Good Turing: Asymptotically Optimal Probability Estimation
- Estimating Entropy on $m$ Bins Given Fewer Than $m$ Samples
- On the relationship between the information measures and the Bayes probability of error
- Relations between entropy and error probability
- Temporal aspects of neural coding in the retina and lateral geniculate
- Some remarks concerning uncertainty and the probability of error (Corresp.)