Mutual Information and Minimum Mean-Square Error in Gaussian Channels
Publication:3546951
DOI: 10.1109/TIT.2005.844072 · zbMath: 1309.94099 · OpenAlex: W2111616148 · MaRDI QID: Q3546951
Dongning Guo, Sergio Verdú, Shlomo Shamai
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/tit.2005.844072
Keywords: smoothing · Wiener process · mutual information · nonlinear filtering · Gaussian channel · optimal estimation · minimum mean-square error (MMSE)
MSC classes: Signal theory (characterization, reconstruction, filtering, etc.) (94A12) · Channel models (including quantum) in information and communication theory (94A40)
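The paper's central result is the I-MMSE identity: for the scalar Gaussian channel Y = √snr · X + N with N ~ N(0,1), the derivative of the input-output mutual information with respect to snr equals half the minimum mean-square error, dI/dsnr = mmse(snr)/2. The sketch below is an illustrative numerical check (not code from the paper) for the special case of a standard Gaussian input, where both quantities have well-known closed forms.

```python
import math

# I-MMSE identity (Guo, Verdu, Shamai 2005): for Y = sqrt(snr)*X + N,
# N ~ N(0,1), it holds that d/dsnr I(snr) = (1/2) * mmse(snr).
# Check for a standard Gaussian input X ~ N(0,1), where the closed forms
# are I(snr) = 0.5*ln(1 + snr) (nats) and mmse(snr) = 1/(1 + snr).

def mutual_info(snr):
    """Mutual information I(X; Y) in nats for Gaussian input."""
    return 0.5 * math.log1p(snr)

def mmse(snr):
    """Minimum mean-square error of estimating X from Y."""
    return 1.0 / (1.0 + snr)

def d_info_d_snr(snr, h=1e-6):
    """Central finite-difference approximation of dI/dsnr."""
    return (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)

# The derivative of the mutual information matches half the MMSE.
for snr in (0.1, 1.0, 10.0):
    assert abs(d_info_d_snr(snr) - 0.5 * mmse(snr)) < 1e-8
```

For the Gaussian input this is also easy to verify analytically: dI/dsnr = 0.5/(1 + snr) = mmse(snr)/2. The remarkable content of the paper is that the same identity holds for an arbitrary input distribution.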
Related Items
A concavity property for the reciprocal of Fisher information and its consequences on Costa's EPI
A stochastic successive minimization method for nonsmooth nonconvex optimization with applications to transceiver design in wireless communication networks
ENTROPY FLOW AND DE BRUIJN'S IDENTITY FOR A CLASS OF STOCHASTIC DIFFERENTIAL EQUATIONS DRIVEN BY FRACTIONAL BROWNIAN MOTION
The adaptive interpolation method for proving replica formulas. Applications to the Curie–Weiss and Wigner spike models
Strong replica symmetry in high-dimensional optimal Bayesian inference
Information theoretic limits of learning a sparse rule
Estimation of low-rank matrices via approximate message passing
An information-percolation bound for spin synchronization on general graphs
Statistical thresholds for tensor PCA
Statistical limits of spiked tensor models
Continuous trajectory planning of mobile sensors for informative forecasting
Perturbative construction of mean-field equations in extensive-rank matrix factorization and denoising
Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
Almost Perfect Privacy for Additive Gaussian Privacy Filters
Minimum mutual information and non-Gaussianity through the maximum entropy method: theory and properties
An integral representation of the relative entropy
Informative windowed forecasting of continuous-time linear systems for mutual information-based sensor planning
Event-triggered remote state estimation over a collision channel with incomplete information
A DE BRUIJN'S IDENTITY FOR DEPENDENT RANDOM VARIABLES BASED ON COPULA THEORY
A Stein deficit for the logarithmic Sobolev inequality
Asymptotic mutual information for the balanced binary stochastic block model
REMARKS ON A SEMICIRCULAR PERTURBATION OF THE FREE FISHER INFORMATION
Is mutual information adequate for feature selection in regression?
An extended orthogonal forward regression algorithm for system identification using entropy
Heat equation and convolution inequalities
Some relations between mutual information and estimation error in Wiener space
Fundamental limits of symmetric low-rank matrix estimation
Information-Theoretic Bounds and Approximations in Neural Population Coding
Multi-target robust waveform design based on harmonic variance and mutual information
Community Detection and Stochastic Block Models
Quantifying information transmission in eukaryotic gradient sensing and chemotactic response
Mutual information for stochastic differential equations driven by fractional Brownian motion
The information-theoretic meaning of Gagliardo-Nirenberg type inequalities
Parametric Regularity of the Conditional Expectations via the Malliavin Calculus and Applications
On signalling and estimation limits for molecular birth-processes
Second-order converses via reverse hypercontractivity
Relations Between Information and Estimation in the Presence of Feedback