An Introduction to Information Theory: Symbols, Signals and Noise (Dover Books on Mathematics)
Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permeated the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future.
To give a solid introduction to this burgeoning field, J. R. Pierce revised his well-received 1961 study of information theory for an up-to-date second edition. Beginning with the origins of the field, Dr. Pierce follows the brilliant formulations of Claude Shannon and describes such aspects of the subject as encoding and binary digits, entropy, language and meaning, efficient encoding, and the noisy channel. He then goes beyond the strict confines of the topic to explore the ways in which information theory relates to physics, cybernetics, psychology, and art. Mathematical formulas are introduced at the appropriate points for the benefit of serious students. A glossary of terms and an appendix on mathematical notation are provided to help the less mathematically sophisticated.
J. R. Pierce worked for many years at the Bell Telephone Laboratories, where he became Director of Research in Communications Principles. He is currently affiliated with the engineering department of the California Institute of Technology. While his background is impeccable, Dr. Pierce also possesses an engaging writing style that makes his book all the more welcome. An Introduction to Information Theory continues to be the most impressive non-technical account available and a fascinating introduction to the subject for laymen.
"An uncommonly good study. . . . Pierce's volume presents the most satisfying discussion to be found." ―Scientific American
Why Read This Book
You should read this book if you want a compact, readable introduction to the core ideas of information theory without getting lost in heavy mathematics. Pierce emphasizes intuition, historical context, and practical consequences for communication systems and the treatment of noise.
Who Will Benefit
Practicing engineers, graduate students, or advanced undergraduates who need an intuitive foundation in information theory to inform work in communications, coding, or statistical signal processing.
Level: Intermediate — Prerequisites: Basic calculus and probability; familiarity with binary digits and elementary communication-system concepts will help but deep prior theory is not required.
Key Takeaways
- Explain the concept of information and quantify it using entropy and related measures.
- Compute basic source coding efficiencies and understand the source coding theorem intuitively.
- Estimate and interpret channel capacity for noisy channels and the implications for reliable communication.
- Describe the role of redundancy and error-correcting ideas in combating noise in communication systems.
- Relate information-theoretic concepts to practical systems and appreciate historical and real-world examples.
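Two of the takeaways above, quantifying information with entropy and estimating the capacity of a noisy channel, can be sketched in a few lines of code. This is a minimal illustration using Shannon's standard formulas (entropy H = -Σ p·log₂p, and C = 1 - H(p) for the binary symmetric channel), not code from the book itself; the function names are illustrative.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity in bits per channel use of a binary symmetric channel
    with crossover (error) probability p: C = 1 - H(p)."""
    return 1.0 - entropy([p, 1.0 - p])

# A fair coin toss carries exactly one bit of information:
print(entropy([0.5, 0.5]))   # 1.0

# A channel that flips roughly 11% of its bits still supports
# reliable communication at about half a bit per use:
print(bsc_capacity(0.11))
```

The second result hints at the surprise in Shannon's noisy-channel theorem, which Pierce develops at length: even a fairly noisy channel has a positive capacity at which error-free communication is possible in principle.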
Topics Covered
- Introduction and historical background
- Symbols, binary digits, and measures of information
- Entropy and information content
- Source coding and redundancy in languages
- Efficient encoding and compression ideas
- The noisy channel and channel capacity
- Error-correcting concepts and practical coding ideas
- Signal, noise, and the limits of communication
- Information, meaning, and communication systems
- Applications and concluding perspectives
How It Compares
Less formal and more conversational than Cover & Thomas's Elements of Information Theory; more approachable for engineers seeking intuition than the mathematically rigorous alternatives.