Measuring time-frequency information content using the Renyi entropies
Publication: 4544575
DOI: 10.1109/18.923723
zbMATH: 0997.94533
OpenAlex: W2100620757
MaRDI QID: Q4544575
Patrick Flandrin, Richard G. Baraniuk, Augustus J. E. M. Janssen, Olivier J. J. Michel
Publication date: 4 August 2002
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.923723
Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
Measures of information, entropy (94A17)
Statistical aspects of information-theoretic topics (62B10)
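As a rough numerical illustration of the quantity this paper studies (a sketch, not the authors' implementation), the Rényi entropy of order α of a nonnegative, normalized time-frequency distribution P is H_α(P) = (1/(1−α)) · log₂ Σ pᵢ^α. The function name and toy 8×8 "spectrogram" below are mine; α = 3 is a commonly used order for time-frequency distributions:

```python
import numpy as np

def renyi_entropy(P, alpha=3.0):
    """Rényi entropy (in bits) of a nonnegative distribution P.

    P is normalized to unit mass first. Note alpha = 1 is the Shannon
    limit and would need separate handling (not done here).
    """
    P = np.asarray(P, dtype=float)
    P = P / P.sum()  # normalize to a probability distribution
    return np.log2(np.sum(P ** alpha)) / (1.0 - alpha)

# Toy example: energy concentrated in one cell gives minimal entropy,
# energy spread uniformly over 64 cells gives log2(64) = 6 bits.
concentrated = np.zeros((8, 8))
concentrated[4, 4] = 1.0
spread = np.ones((8, 8))
print(renyi_entropy(concentrated))  # 0.0
print(renyi_entropy(spread))        # 6.0
```

Low entropy thus signals a concentrated (informative) time-frequency distribution, high entropy a diffuse one, which is the sense in which the entropies measure "information content" here.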
Related Items (13)
Factorization property of convolutions of white noise operators
High order statistics and time-frequency domain to classify heart sounds for subjects under cardiac stress test
Characterization of time series via Rényi complexity-entropy curves
Bayesian analysis of the inverse generalized gamma distribution using objective priors
Adaptive synchrosqueezing transform with a time-varying parameter for non-stationary signal separation
Distinguishing stationary/nonstationary scaling processes using wavelet Tsallis \(q\)-entropies
Rényi entropy of fuzzy dynamical systems
On scale and concentration invariance in entropies
Robust coding for a class of sources: Applications in control and reliable communication over limited capacity channels
Wavelets and statistical analysis of functional magnetic resonance images of the human brain
A survey of uncertainty principles and some signal processing applications
On some entropy functionals derived from Rényi information divergence
Upper bounds on Shannon and Rényi entropies for central potentials