Entropy in Information Theory (of Claude Shannon) and Coding Theory


Chris Hillman hillman@math.washington.edu

This is a local copy of the original: http://www.math.washington.edu/~hillman/Entropy/infcode.html


In 1948, motivated by the problem of efficiently transmitting information over a noisy communication channel, Claude Shannon introduced a revolutionary new probabilistic way of thinking about communication and simultaneously created the first truly mathematical theory of entropy. His ideas created a sensation and were rapidly developed along two main lines: information theory, which employs probability and ergodic theory to study the statistical characteristics of data and communication systems, and coding theory, which uses mainly algebraic and geometric tools to contrive efficient codes for various situations. However, while the methods of the two fields are different, they are spiritually so closely related that it would be misleading to give them two separate pages on this website.

* Thanks to Emre Telatar and Lucent Technologies (formerly Bell Labs, where Shannon worked for many years), the full text of Shannon's classic 1948 paper, A Mathematical Theory of Communication, is now available electronically. This paper is very readable, and Parts I and II are still in many ways the best introduction to the modern concept of entropy. Highest recommendation!

Here are some truly fabulous expository papers which I think give an excellent overview of this whole area:

If you've forgotten what a logarithm is, you might want to start instead with the Primer on Information Theory by Thomas D. Schneider (Laboratory of Molecular Biology, NIH), which gives a very gentle introduction to Shannon's discrete entropy.
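To make the discrete entropy mentioned above concrete, here is a minimal Python sketch (not part of the original page; the function name and example strings are my own illustration) that computes Shannon's entropy H = Σ p_i log2(1/p_i), in bits, for the empirical distribution of a sequence of symbols:

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Shannon entropy, in bits, of the empirical distribution of
    a sequence of symbols: H = sum over i of p_i * log2(1/p_i)."""
    counts = Counter(symbols)
    total = sum(counts.values())
    return sum((n / total) * log2(total / n) for n in counts.values())

# Two equally likely outcomes (a fair coin) carry exactly 1 bit per symbol:
print(shannon_entropy("HTHT"))  # 1.0
# A constant source is perfectly predictable and carries 0 bits:
print(shannon_entropy("AAAA"))  # 0.0
```

The skewed case falls in between: a source emitting one symbol 3/4 of the time has entropy below 1 bit, which is exactly the slack that efficient codes exploit.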

Here are some expository papers giving a nice overview of coding theory:

For the serious student of coding theory, here are some longer expository works, including some book length textbooks:

Further Reading

Approximately 200 books on information and coding theory have been published since Shannon's seminal paper. See the list of textbooks in this area maintained by Werner Heise (Mathematics, Technische Universitaet, Munich).

Some of the most recent textbooks (plus one classic) include the following:

Ioannis Kontoyiannis (Statistics, Electrical and Computer Engineering, Purdue) has compiled another bibliography of suggested reading in this field.