
Information theory. Coding theorems for discrete memoryless systems. 2nd ed. (English) Zbl 1256.94002

Cambridge: Cambridge University Press (ISBN 978-0-521-19681-9/hbk). xxi, 499 p. (2011).
Information theory stems from Claude Shannon’s work in 1948, which was motivated by problems in communication theory. It is a branch of applied mathematics concerned with the statistical aspects of transmitting and processing information: it deals with the measurement of information, the representation of information, and the capacity of communication systems to transmit and process it. This book is the second edition of a classic text; for the first edition see [Zbl 0568.94012].
In this book, the authors present various important results concerning discrete memoryless communication systems using a mostly combinatorial approach. The book is composed of three parts. The first part is on information measures in simple coding problems and is made up of five chapters. Chapter 1 introduces measures of information such as Shannon entropy, informational divergence and mutual information in the context of source coding and hypothesis testing. The second chapter covers the preliminaries to the combinatorial approach that will be used for studying discrete memoryless communication systems; in this chapter, the authors develop techniques based on a few simple combinatorial lemmas. Chapter 3 deals with the formal properties of Shannon information measures, while Chapter 4 treats non-block source coding. Chapter 5 focuses on the blowing-up lemma.
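To make the information measures of Chapter 1 concrete, here is a minimal sketch (not taken from the book) that computes Shannon entropy, informational divergence and mutual information for finite alphabets, with probability mass functions represented as numpy arrays; the function names and the binary-symmetric-channel example are illustrative only.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(P) in bits, with the convention 0*log 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def divergence(p, q):
    """Informational divergence D(P||Q) in bits; infinite if P puts mass where Q does not."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.any((p > 0) & (q == 0)):
        return np.inf
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

def mutual_information(pxy):
    """Mutual information I(X;Y) = D(P_XY || P_X x P_Y) for a joint pmf given as a matrix."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    return divergence(pxy.ravel(), (px @ py).ravel())

# Illustration: uniform input through a binary symmetric channel with crossover 0.1.
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])
print(entropy([0.5, 0.5]))         # 1.0 bit
print(mutual_information(pxy))     # about 0.531 bits, i.e. 1 - h(0.1)
```

All quantities here use base-2 logarithms, so they are measured in bits; the book states its results for a general logarithm base.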
The second part of the book presents coding theorems for two-terminal discrete memoryless systems and comprises Chapters 6 through 12. Chapter 6 covers noisy channel coding problems; it contains Shannon’s noisy channel coding theorem, weak and strong converse theorems, and various generalizations. In Chapter 7, the authors treat the problem of constructing source codes that meet a given fidelity criterion with rates as small as possible. The computation of channel capacity and an algorithm for computing the rate-distortion function are presented in Chapter 8. In Chapter 9, the covering lemma and results concerning the error exponent in source coding are presented, while in Chapter 10, a packing lemma and some results related to the error exponent in channel coding are treated. Chapter 11 is new in this second edition and covers zero-error problems and their connection to combinatorics. Chapter 12 deals with arbitrarily varying channels.
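A standard alternating-optimization method for the capacity computation discussed in Chapter 8 is the Blahut-Arimoto algorithm; the following is a generic Python sketch for a finite-alphabet channel given by its transition matrix, written in the reviewer's own notation rather than the book's exact formulation.

```python
import numpy as np

def channel_capacity(W, iterations=500, tol=1e-10):
    """Blahut-Arimoto iteration for the capacity of a discrete memoryless channel
    with transition matrix W[x, y] = W(y | x).  Returns (capacity in bits,
    an approximately capacity-achieving input distribution).  A generic sketch,
    not the book's exact formulation."""
    W = np.asarray(W, dtype=float)
    n_inputs = W.shape[0]
    p = np.full(n_inputs, 1.0 / n_inputs)          # start from the uniform input

    def divergences(p):
        """D(W(.|x) || q) in nats for every input x, where q is the output law under p."""
        q = p @ W
        return np.array([np.sum(W[x, W[x] > 0] * np.log(W[x, W[x] > 0] / q[W[x] > 0]))
                         for x in range(n_inputs)])

    for _ in range(iterations):
        d = divergences(p)
        new_p = p * np.exp(d)                       # multiplicative update
        new_p /= new_p.sum()
        if np.max(np.abs(new_p - p)) < tol:
            p = new_p
            break
        p = new_p
    capacity_bits = float(p @ divergences(p) / np.log(2))   # convert nats to bits
    return capacity_bits, p

# Binary symmetric channel with crossover probability 0.1:
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
print(channel_capacity(W))   # capacity about 0.531 bits, i.e. 1 - h(0.1)
```

The iteration increases the mutual information monotonically and converges to the capacity; the value returned uses the identity that, at the optimum, capacity equals the average divergence between the channel rows and the induced output distribution.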
The third part of this book deals with multi-terminal systems and consists of five chapters, namely Chapters 13 through 17. Chapter 13 focuses on a general set-up for multi-terminal source coding problems. A discrete memoryless multiple source assumes a probabilistic dependence between the components of the vector emitted at each instant. Separate encoding-decoding of the component sources generally yields worse codes for the joint source than joint encoding-decoding does; some in-between cases are investigated, and the separate coding of correlated sources is analyzed. Chapter 14 treats multiple-access channels. In this chapter, the authors present the solution of a channel coding problem for a simple network and then introduce a general class of channel coding problems for multi-terminal networks. Chapter 15 solves the entropy characterization problem and the image size characterization problem. Chapter 16 deals with source and channel networks. Chapter 17 is new in this edition and deals with information-theoretic security.
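As an illustration of the in-between case of separate encoding with joint decoding, a standard statement of the achievable rate region for two correlated memoryless sources (the Slepian-Wolf theorem; the reviewer's notation and level of generality differ from the book's) reads:

```latex
% Rates achievable when X and Y are encoded separately and decoded jointly:
\begin{align*}
  R_X       &\ge H(X \mid Y),\\
  R_Y       &\ge H(Y \mid X),\\
  R_X + R_Y &\ge H(X, Y).
\end{align*}
% Remarkably, the sum rate H(X,Y) required by a joint encoder remains achievable
% even though the two encoders observe their component sources separately.
```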
The book is intended for graduate students in electrical engineering, computer science, and applied mathematics. The presentation is lucid, the proofs are rigorous, and the topics remain relevant. Each chapter contains a section with many problems, along with helpful hints for their solution, which broaden the reader’s understanding of the subject matter. Illustrative examples are scarce, and this may be the only shortcoming of this classic text. Like the first edition, the new edition will be very useful to practitioners and others engaged in information and communication theory.

MSC:

94-02 Research exposition (monographs, survey articles) pertaining to information and communication theory
94A15 Information theory (general)
94A17 Measures of information, entropy
94A24 Coding theorems (Shannon theory)
94A29 Source coding
94A34 Rate-distortion theory in information and communication theory
94A40 Channel models (including quantum) in information and communication theory