The Laws of Cryptography:
Coding and Information Theory

by Neal R. Wagner

Copyright © 2001 by Neal R. Wagner. All rights reserved.

NOTE: This site is obsolete. See the book draft (in PDF):

Law ENTROPY1: The entropy of a message is just the number of bits of information in the message, that is, the number of bits needed for the shortest possible encoding of the message.
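As a concrete illustration of this law, the sketch below (function names are my own, not from the book) computes the empirical Shannon entropy of a message from its symbol frequencies, H = -sum(p * log2(p)); multiplying by the message length estimates the number of bits the shortest possible encoding would need.

```python
from collections import Counter
from math import log2

def entropy_bits_per_symbol(message: str) -> float:
    """Empirical Shannon entropy of a message, in bits per symbol:
    H = -sum(p * log2(p)) over the observed symbol frequencies p."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

msg = "aaaabbc"  # 'a' appears 4 times, 'b' twice, 'c' once
h = entropy_bits_per_symbol(msg)
# The shortest possible encoding of msg needs about h * len(msg) bits.
print(round(h, 4), round(h * len(msg), 2))
```

For this 7-symbol message the entropy is about 1.38 bits per symbol, so the whole message carries roughly 9.65 bits of information, well under the 56 bits of a naive 8-bit encoding.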

Law ENTROPY2: A random message has the most information (the greatest entropy). [Shannon]
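This law can be checked numerically: over a fixed alphabet, the uniform (fully random) distribution attains the maximum entropy log2(n), and any skewed (more predictable) distribution scores strictly lower. A small sketch, with illustrative distributions of my own choosing:

```python
from math import log2

def entropy(probs) -> float:
    """Shannon entropy H = -sum(p * log2(p)), skipping zero-probability symbols."""
    return -sum(p * log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]  # a random source: all 4 symbols equally likely
skewed  = [0.7, 0.1, 0.1, 0.1]      # a predictable source over the same alphabet

print(entropy(uniform))  # 2.0 bits, the maximum log2(4) for a 4-symbol alphabet
print(entropy(skewed))   # strictly less than 2.0 bits
```

The skewed source comes out near 1.36 bits per symbol, confirming that predictability reduces information content.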

Law INFORMATION1: In all coding theory, information transmission is essentially the same as information storage, since the latter is just transmission from now to then.

Law SHANNON1: Over a noisy channel it is always possible to use a long enough random code to signal arbitrarily close to the channel capacity with arbitrarily good reliability [also known as Shannon's Noisy Coding Theorem].
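For the standard example of this theorem, the binary symmetric channel with crossover probability p, the channel capacity has the closed form C = 1 - H(p), where H is the binary entropy function. The sketch below (a standard formula, not code from the book) evaluates it at a few values of p:

```python
from math import log2

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H(p). Shannon's theorem says any rate below C is achievable
    with arbitrarily small error probability, using long enough codes."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # noiseless channel: capacity 1 bit per channel use
print(bsc_capacity(0.11))  # about 0.5 bits per channel use
print(bsc_capacity(0.5))   # pure noise: capacity 0
```

Note that the theorem is nonconstructive: it guarantees that good long random codes exist near capacity, but finding practical codes that approach it is a separate engineering problem.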

Revision date: 2002-01-05. (Please use ISO 8601, the International Standard.)