By Maxim Raginsky, Department of Electrical and Computer Engineering, Coordinated Science Laboratory, University of Illinois at Urbana-Champaign, United States, maxim@illinois.edu | Igal Sason, Department of Electrical Engineering, Technion – Israel Institute of Technology, Israel, sason@ee.technion.ac.il
Concentration inequalities have been the subject of exciting developments over the last two decades, and have been intensively studied and used as a powerful tool in a wide range of areas. These include convex geometry, functional analysis, statistical physics, mathematical statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and percolation), information theory, theoretical computer science, learning theory, and dynamical systems.
This monograph focuses on some of the key modern mathematical tools used to derive concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to being a survey, the monograph also presents several recent results derived by the authors.
The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach are exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication.
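For orientation, the prototypical result of this kind is the classical Azuma–Hoeffding inequality, stated here purely for illustration (the statement below is standard and is not drawn from this abstract): if $\{X_k\}_{k=0}^{n}$ is a martingale with bounded differences, $|X_k - X_{k-1}| \le d_k$ almost surely for each $k$, then for every $r > 0$,

% Azuma–Hoeffding inequality (illustrative statement; notation introduced here)
\[
  \mathbb{P}\bigl( |X_n - X_0| \ge r \bigr)
  \;\le\; 2 \exp\!\left( - \frac{r^2}{2 \sum_{k=1}^{n} d_k^2} \right).
\]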
The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities for functions of many independent random variables. The basic ingredients of the entropy method are discussed first, in conjunction with the closely related topic of logarithmic Sobolev inequalities, which are typical of the so-called functional approach to studying the concentration of measure phenomenon. The discussion of logarithmic Sobolev inequalities is complemented by a related viewpoint based on probability in metric spaces. This viewpoint centers on the so-called transportation-cost inequalities, whose roots are in information theory. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, we discuss several applications of the entropy method and related information-theoretic tools to problems in communications and coding. These include strong converses, empirical distributions of good channel codes with non-vanishing error probability, and an information-theoretic converse for concentration of measure.
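To give the flavor of the functional approach, one canonical instance (an illustrative sketch under standard smoothness assumptions, not a statement taken from this abstract) is the Gaussian logarithmic Sobolev inequality: for the standard Gaussian measure $\gamma_n$ on $\mathbb{R}^n$ and every sufficiently smooth $f$,

% Gaussian logarithmic Sobolev inequality (Gross), where
% Ent_{\gamma}(g) := E_{\gamma}[g \ln g] - E_{\gamma}[g] \ln E_{\gamma}[g] for g >= 0
\[
  \operatorname{Ent}_{\gamma_n}\bigl(f^2\bigr)
  \;\le\; 2\, \mathbb{E}_{\gamma_n}\bigl[\,\|\nabla f\|^2\,\bigr].
\]

Via the Herbst argument, this functional inequality yields Gaussian concentration: for every 1-Lipschitz function $F$ and every $r > 0$,

\[
  \mathbb{P}\bigl( F \ge \mathbb{E}[F] + r \bigr) \;\le\; e^{-r^2/2}.
\]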
Concentration of Measure Inequalities in Information Theory, Communications, and Coding is essential reading for all researchers and scientists in information theory and coding.
Concentration of Measure Inequalities in Information Theory, Communications, and Coding: Second Edition
ISBN: 978-1-60198-906-2