Foundations and Trends® in Signal Processing, Vol. 6, Issue 4

The Interplay Between Information and Estimation Measures

By Dongning Guo, Northwestern University, USA, dguo@northwestern.edu | Shlomo Shamai (Shitz), Technion – Israel Institute of Technology, Israel, sshlomo@ee.technion.ac.il | Sergio Verdú, Princeton University, USA, verdu@princeton.edu

 
Suggested Citation
Dongning Guo, Shlomo Shamai (Shitz) and Sergio Verdú (2013), "The Interplay Between Information and Estimation Measures", Foundations and Trends® in Signal Processing: Vol. 6: No. 4, pp. 243–429. http://dx.doi.org/10.1561/2000000018

Publication Date: 28 Nov 2013
© 2013 D. Guo, S. Shamai, and S. Verdú
 
Subjects
Detection and estimation, Information theory and statistics, Multiuser information theory, Shannon theory, Signal processing for communications, Wireless communications, Linear and nonlinear filtering, Nonlinear signal processing, Statistical signal processing, Brownian models, Point processes
 


Abstract

This monograph surveys the interactions between information measures and estimation measures, as well as their applications. The emphasis is on formulas that express the major information measures, such as entropy, mutual information, and relative entropy, in terms of the minimum mean square error (MMSE) achievable when estimating random variables contaminated by Gaussian noise. These relationships lead to a wide range of applications, from a universal relationship in continuous-time nonlinear filtering, to optimal power allocation in communication systems, to simplified proofs of important results in information theory such as the entropy power inequality and converses in multiuser information theory.
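The prototypical relationship of this kind is the I–MMSE formula, stated here in one standard scalar form (with mutual information in nats and unit-variance noise; these normalization conventions are assumed for illustration):

\[ \frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\, I\bigl(X;\, \sqrt{\mathsf{snr}}\, X + N\bigr) \;=\; \frac{1}{2}\, \mathrm{mmse}(\mathsf{snr}), \qquad N \sim \mathcal{N}(0,1) \ \text{independent of}\ X, \]

where \( \mathrm{mmse}(\mathsf{snr}) = \mathbb{E}\bigl[\bigl(X - \mathbb{E}[X \mid \sqrt{\mathsf{snr}}\, X + N]\bigr)^2\bigr] \) is the minimum mean square error of estimating X from the channel output.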

DOI: 10.1561/2000000018
ISBN: 978-1-60198-748-8 (paperback), 212 pp., $99.00
ISBN: 978-1-60198-749-5 (e-book, PDF), 212 pp., $230.00
Table of contents:
1. Introduction
2. Basic Information and Estimation Measures
3. Properties of the MMSE in Gaussian Noise
4. Mutual Information and MMSE
5. Mutual Information and MMSE in Discrete- and Continuous-time Gaussian Channels
6. Entropy, Relative Entropy, Fisher Information, and Mismatched Estimation
7. Applications of I–MMSE
8. Information and Estimation Measures in Poisson Models and Channels
9. Beyond Gaussian and Poisson Models
10. Outlook
Acknowledgments
Appendices
References

The Interplay Between Information and Estimation Measures

If information theory and estimation theory are thought of as two scientific languages, then their key vocabularies are information measures and estimation measures, respectively. The basic information measures are entropy, mutual information and relative entropy. Among the most important estimation measures are mean square error (MSE) and Fisher information. Playing a paramount role in information theory and estimation theory, those measures are akin to mass, force and velocity in classical mechanics, or energy, entropy and temperature in thermodynamics.
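For reference, these measures have the following standard definitions (a sketch in the usual notation, assuming a discrete random variable for entropy and relative entropy and a differentiable density \( f_X \) for Fisher information; the monograph's own notation may differ):

\[ H(X) = -\sum_x P_X(x) \log P_X(x), \qquad D(P\|Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}, \]

\[ I(X;Y) = D\bigl(P_{XY} \,\big\|\, P_X P_Y\bigr), \qquad \mathrm{mse} = \mathbb{E}\bigl[(X - \hat X)^2\bigr], \qquad J(X) = \mathbb{E}\!\left[\left(\frac{f_X'(X)}{f_X(X)}\right)^{\!2}\right]. \]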

The Interplay Between Information and Estimation Measures is intended as a handbook of known formulas that directly relate information measures and estimation measures. It provides intuition, draws connections between these formulas, highlights some important applications, and motivates further exploration. The main focus is on such formulas in the context of the additive Gaussian noise model, with lesser treatment of others, such as the Poisson point-process channel. Also included are a number of new results published here for the first time. Proofs of some basic results are provided, whereas more technical proofs already available in the literature are omitted. In 2004, the authors of this monograph found a general differential relationship now commonly referred to as the I–MMSE formula. In this book, a new and complete proof of the I–MMSE formula is developed, including technical details omitted from the original papers.
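As a quick illustration (not from the monograph), the I–MMSE formula can be checked numerically in the one case where both sides have elementary closed forms, a standard Gaussian input: I(snr) = ½ ln(1 + snr) nats and mmse(snr) = 1/(1 + snr). A minimal sketch in Python:

import numpy as np

# Scalar Gaussian channel Y = sqrt(snr)*X + N with X, N ~ N(0, 1) independent.
# Classical closed forms for this input:
#   I(snr)    = 0.5 * ln(1 + snr)   (mutual information, in nats)
#   mmse(snr) = 1 / (1 + snr)       (MMSE of estimating X from Y)
# The I-MMSE formula asserts dI/dsnr = 0.5 * mmse(snr).

snr = np.linspace(0.1, 10.0, 200)
mi = 0.5 * np.log1p(snr)        # I(snr) in nats
mmse = 1.0 / (1.0 + snr)        # mmse(snr)

dmi = np.gradient(mi, snr)      # numerical derivative of I(snr)

# Compare away from the endpoints, where np.gradient is less accurate.
gap = np.abs(dmi[1:-1] - 0.5 * mmse[1:-1]).max()
assert gap < 1e-3
print(f"max |dI/dsnr - mmse/2| on interior grid points: {gap:.2e}")

With 200 grid points, the central-difference error is on the order of 10^-4, consistent with the formula.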

The Interplay Between Information and Estimation Measures concludes by highlighting the impact of the information–estimation relationships on a variety of information-theoretic problems of current interest, and provides some further perspective on their applications.

 
SIG-018