A maximum entropy criterion of filtering and coding for stationary autoregressive signals: Its physical interpretations and suggestions for its application to neural information transmission
DOI: 10.1007/BF00449592 · zbMATH: 0576.92015 · Wikidata: Q42109907 · Scholia: Q42109907 · MaRDI QID: Q1064982
Kojiro Aya, Hiroshi Nakahama, Hisashi Fujii
Publication date: 1985
Published in: Biological Cybernetics
Keywords: convolution; filtering; decoding; white noise; deconvolution; encoding; maximum entropy criterion; feedback system; autoregressive model of information transmission; continuous communication system; Gaussian signal processing; power transmission in thermodynamics
MSC: Inference from stochastic processes and prediction (62M20); Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10); Coding theorems (Shannon theory) (94A24); Other natural sciences (mathematical treatment) (92F05); Physiological, cellular and medical topics (92Cxx)
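The record itself carries no technical exposition, but the technique named in the title, maximum-entropy modeling of a stationary autoregressive signal, is conventionally realized by Burg's recursion. The sketch below (Python/NumPy) illustrates that generic textbook recursion only; it is not the authors' filtering-and-coding criterion, and the function name `burg_ar` and the AR(2) test signal are assumptions introduced here for illustration.

```python
import numpy as np

def burg_ar(x, order):
    """Fit an AR model to a stationary signal by Burg's maximum-entropy
    recursion.  Returns (a, e), where a holds the prediction-error filter
    coefficients [1, a1, ..., ap] and e is the residual (innovation) power."""
    x = np.asarray(x, dtype=float)
    a = np.array([1.0])
    e = np.mean(x ** 2)          # order-0 prediction-error power
    f, b = x.copy(), x.copy()    # forward / backward prediction errors
    for _ in range(order):
        ff, bb = f[1:], b[:-1]   # align f[t] with b[t-1]
        # Reflection coefficient minimizing the summed fwd+bwd error power.
        k = -2.0 * np.dot(ff, bb) / (np.dot(ff, ff) + np.dot(bb, bb))
        # Levinson-style update of the prediction-error filter.
        a_pad = np.concatenate([a, [0.0]])
        a = a_pad + k * a_pad[::-1]
        f, b = ff + k * bb, bb + k * ff
        e *= 1.0 - k ** 2
    return a, e

# Illustrative check on a synthetic stationary AR(2) process driven by
# Gaussian white noise (both roots inside the unit circle).
rng = np.random.default_rng(0)
n = 5000
w = rng.standard_normal(n)
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.75 * x[t - 1] - 0.5 * x[t - 2] + w[t]
a, e = burg_ar(x, order=2)
print(a, e)   # expect roughly [1, -0.75, 0.5] and e near 1
```

For the test signal the recursion should recover a prediction-error filter close to 1 - 0.75 z^{-1} + 0.5 z^{-2}, consistent with the white-noise innovation model the keywords describe.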
Cites Work
- A Mathematical Theory of Communication
- Dependency as a measure to estimate the order and the values of Markov processes
- Dependency representing Markov properties of nonstationary spike trains recorded from the cat's optic tract fibers
- Note on completeness theorems of Paley-Wiener type
- Information Theory and Statistical Mechanics
- Markov dependency based on Shannon's entropy and its application to neural spike trains
- A new look at the statistical model identification