
Universal Estimation of Information Measures for Analog Sources

By Qing Wang, Credit Suisse Group, USA, qingwang@Princeton.edu | Sanjeev R. Kulkarni, Department of Electrical Engineering, Princeton University, USA, Kulkarni@Princeton.edu | Sergio Verdú, Department of Electrical Engineering, Princeton University, USA, Verdu@Princeton.edu

 
Suggested Citation
Qing Wang, Sanjeev R. Kulkarni and Sergio Verdú (2009), "Universal Estimation of Information Measures for Analog Sources", Foundations and Trends® in Communications and Information Theory: Vol. 5: No. 3, pp 265-353. http://dx.doi.org/10.1561/0100000021

Publication Date: 27 May 2009
© 2009 Q. Wang, S. R. Kulkarni and S. Verdú
 
Subjects
Source coding
 


Abstract

This monograph presents an overview of universal estimation of information measures for continuous-alphabet sources. Special attention is given to the estimation of mutual information and divergence based on independent and identically distributed (i.i.d.) data. Plug-in methods, partitioning-based algorithms, and nearest-neighbor algorithms, as well as other approaches, are reviewed, with particular focus on consistency, speed of convergence, and experimental performance.
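As a concrete illustration of the nearest-neighbor family mentioned above, the sketch below implements the classical Kozachenko-Leonenko k-nearest-neighbor estimator of differential entropy, a standard member of that family rather than any particular variant analyzed in the monograph; the function name knn_entropy and the choice k = 3 are illustrative assumptions.

    import numpy as np
    from scipy.spatial import cKDTree
    from scipy.special import digamma, gammaln

    def knn_entropy(x, k=3):
        """Kozachenko-Leonenko k-NN estimate of differential entropy (in nats).

        x is an (n, d) array of i.i.d. samples, assumed distinct
        (true with probability one for a continuous distribution).
        """
        x = np.asarray(x, dtype=float)
        n, d = x.shape
        # distance from each sample to its k-th nearest neighbor (excluding itself)
        eps = cKDTree(x).query(x, k=k + 1)[0][:, -1]
        # log-volume of the d-dimensional Euclidean unit ball
        log_cd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
        return digamma(n) - digamma(k) + log_cd + d * np.mean(np.log(eps))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        samples = rng.standard_normal((5000, 1))
        # differential entropy of N(0,1) is 0.5*log(2*pi*e) ~ 1.419 nats
        print(knn_entropy(samples, k=3))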

DOI:10.1561/0100000021
Paperback: ISBN 978-1-60198-230-8, 120 pp., $85.00
E-book (PDF): ISBN 978-1-60198-231-5, 120 pp., $100.00
Table of contents:
1. Introduction
2. Plug-in Algorithms
3. Algorithms Based on Partitioning
4. Algorithms Based on k-Nearest-Neighbor Distances
5. Other Algorithms
6. Algorithm Summary and Experiments
7. Sources with Memory
References

Universal Estimation of Information Measures for Analog Sources

Entropy, mutual information and divergence measure the randomness, dependence and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures.
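For analog sources admitting densities, these three quantities have the standard definitions below (differential entropy, mutual information, and divergence, respectively); the density notation f_X, f_{XY}, f, g is ours and is only meant to fix ideas.

\[
h(X) = -\int f_X(x)\,\log f_X(x)\,dx, \qquad
I(X;Y) = \iint f_{XY}(x,y)\,\log\frac{f_{XY}(x,y)}{f_X(x)\,f_Y(y)}\,dx\,dy,
\]
\[
D(f\,\|\,g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx.
\]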

Universal Estimation of Information Measures for Analog Sources presents a comprehensive survey of universal estimation of information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information and divergence and their applications. The book reviews the consistency of these universal algorithms, the corresponding sufficient conditions, and their speed of convergence.
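To make the partitioning-based (plug-in) approach concrete, here is a minimal sketch that discretizes a pair of samples on a product grid and computes the mutual information of the resulting histogram; the function name hist_mutual_information, the bin count of 20, and the Gaussian test case are illustrative assumptions, and the sketch deliberately ignores the bias and consistency questions that the monograph treats in detail.

    import numpy as np

    def hist_mutual_information(x, y, bins=20):
        """Plug-in estimate of I(X;Y) in nats from a product-partition histogram."""
        counts, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = counts / counts.sum()          # empirical joint distribution
        px = pxy.sum(axis=1, keepdims=True)  # empirical marginal of X
        py = pxy.sum(axis=0, keepdims=True)  # empirical marginal of Y
        nz = pxy > 0                         # skip empty cells (0 log 0 = 0)
        return np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        rho = 0.8
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=20000)
        # for jointly Gaussian (X, Y): I(X;Y) = -0.5*log(1 - rho^2) ~ 0.511 nats
        print(hist_mutual_information(z[:, 0], z[:, 1]))

Refining such a partition as the sample size grows, at an appropriate rate, is precisely the kind of condition under which the consistency results surveyed in the book apply.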

Universal Estimation of Information Measures for Analog Sources provides a comprehensive review of an increasingly important topic in Information Theory. It will be of interest to students, practitioners, and researchers working in Information Theory.

 