DSPRelated.com
Books

Information Theory (Dover Books on Mathematics)

Robert B. Ash 1990

Developed by Claude Shannon and Norbert Wiener in the late 1940s, information theory, or statistical communication theory, deals with the theoretical underpinnings of a wide range of communication devices: radio, television, radar, computers, telegraphy, and more. This book is an excellent introduction to the mathematics underlying the theory.
Designed for upper-level undergraduates and first-year graduate students, the book treats three major areas: analysis of channel models and proof of coding theorems (Chapters 3, 7, and 8); study of specific coding systems (Chapters 2, 4, and 5); and study of statistical properties of information sources (Chapter 6). Among the topics covered are noiseless coding, the discrete memoryless channel, error-correcting codes, information sources, channels with memory, and continuous channels.
The author has tried to keep the prerequisites to a minimum. However, students should have a knowledge of basic probability theory. Some measure and Hilbert space theory is helpful as well for the last two sections of Chapter 8, which treat time-continuous channels. An appendix summarizes the Hilbert space background and the results from the theory of stochastic processes necessary for these sections. The appendix is not self-contained, but will serve to pinpoint some of the specific equipment needed for the analysis of time-continuous channels.
In addition to historical notes at the end of each chapter indicating the origin of some of the results, the author has also included 60 problems, with detailed solutions, making the book especially valuable for independent study.


Why Read This Book

You should read this book if you want a clear, mathematically rigorous foundation in the core ideas of information theory that underlie DSP, communications, and coding. You will learn the definitions and proofs behind entropy, mutual information, channel capacity, and the basic coding theorems, building intuition that connects statistical source behavior to practical coding and communication limits.

Who Will Benefit

Upper‑level undergraduates, first‑year graduate students, and practicing engineers in communications, DSP, or signal processing who need a compact, theorem‑driven introduction to information theory.

Level: Intermediate — Prerequisites: Calculus, linear algebra, and a working knowledge of probability and random variables (basic measure theory not required); familiarity with basic signals/communication concepts is helpful.

Get This Book

Key Takeaways

  • Define and manipulate core information measures such as entropy, conditional entropy, relative entropy (Kullback–Leibler divergence), and mutual information
  • Prove and apply the noiseless source coding theorems and construct practical source codes (e.g., prefix codes)
  • Analyze discrete memoryless channels and derive channel capacity and the fundamental channel coding theorems
  • Understand basic error‑correcting code concepts, performance bounds, and how coding relates to channel reliability
  • Characterize statistical properties of information sources (Markov sources, ergodicity) and relate source statistics to coding efficiency
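The first and third takeaways can be made concrete with a short numerical sketch. The example below (not from the book; the function names and the binary-symmetric-channel setup are illustrative choices) computes entropy and the mutual information I(X;Y) of a binary symmetric channel, and shows that at a uniform input the mutual information equals the well-known capacity C = 1 − H(e):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def binary_entropy(p):
    """Entropy of a Bernoulli(p) source, often written H(p)."""
    return entropy([p, 1 - p])

def bsc_mutual_information(e, p1):
    """I(X;Y) for a binary symmetric channel with crossover
    probability e and input distribution P(X=1) = p1."""
    py1 = p1 * (1 - e) + (1 - p1) * e   # P(Y=1)
    # I(X;Y) = H(Y) - H(Y|X), and H(Y|X) = H(e) for a BSC
    return binary_entropy(py1) - binary_entropy(e)

print(entropy([0.5, 0.5]))               # 1.0 bit: a fair coin
print(bsc_mutual_information(0.1, 0.5))  # equals 1 - H(0.1), the BSC capacity
```

Evaluating the mutual information at a non-uniform input (e.g. `p1 = 0.3`) gives a strictly smaller value, which is the sense in which capacity is a maximum over input distributions.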

Topics Covered

  1. Introduction and Mathematical Preliminaries
  2. Measures of Information: Entropy and Relative Entropy
  3. Noiseless Source Coding and Coding Theorems
  4. Practical Source Coding Methods (prefix codes, efficiency)
  5. Channel Models and the Discrete Memoryless Channel
  6. Channel Capacity and Fundamental Coding Theorems
  7. Error‑Correcting Codes and Performance Bounds
  8. Statistical Properties of Information Sources (Markov chains, ergodicity)
  9. Applications and Examples in Communication Systems
  10. Appendices: Mathematical Tools and Proof Techniques
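The noiseless source coding material (topics 3 and 4 above) can be illustrated with a few lines of code. This sketch (illustrative, not taken from the book) checks the Kraft inequality for a set of codeword lengths and compares the average length of a Shannon code against the source entropy, the lower bound guaranteed by the noiseless coding theorem:

```python
import math

def kraft_sum(lengths, q=2):
    """Left-hand side of the Kraft inequality for a q-ary code:
    a prefix code with these lengths exists iff the sum is <= 1."""
    return sum(q ** -l for l in lengths)

def shannon_lengths(probs):
    """Shannon code lengths ceil(-log2 p); they always satisfy Kraft
    and achieve average length within 1 bit of the entropy."""
    return [math.ceil(-math.log2(p)) for p in probs]

probs = [0.5, 0.25, 0.125, 0.125]       # a dyadic source
lengths = shannon_lengths(probs)        # [1, 2, 3, 3]
H = -sum(p * math.log2(p) for p in probs)          # entropy: 1.75 bits
avg = sum(p * l for p, l in zip(probs, lengths))   # average length: 1.75
assert kraft_sum(lengths) <= 1          # a prefix code with these lengths exists
```

For this dyadic source the Shannon lengths meet the entropy bound exactly; for general distributions the theorem guarantees only H ≤ average length < H + 1.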

How It Compares

More concise and theorem‑focused than Cover & Thomas' Elements of Information Theory (which is broader and more modern); less application/algorithm oriented than David MacKay's text, but excellent as a rigorous, compact classical foundation.

Related Books