schaffer at SPAMoptonline.net
Tue Feb 27 22:17:02 MST 2001
i left in a silly quote from a journalist:
> "Entropy is information."
and Charles then rightly wondered:
> Somehow I would have thought of information as order, not entropy?
technically, in information theory, the information content of a message
(outcome of an experiment, data, signal, etc.) is a _measure_ of the change
in uncertainty in some property as a result of receipt of the message.
i shouldn't have included this part of the quote; i was more interested in
the historical perspective.
here is a passage from _Information Theory_, by Robert Ash (my comments in
[square brackets]):
The measure of information
Consider the following experiment. Two coins are available, one unbiased
[one side heads, one side tails, X = 0] and the other two-headed [both sides
are heads, X = 1]. A coin is selected at random [from the pair] and tossed
twice, and the number of heads is recorded. We ask how much information is
conveyed about the identity of the coin by the number of heads obtained. It
is clear that the number of heads does tell us something about the nature of
the coin. If less than 2 heads are obtained, the unbiased coin must have
been used; if both throws resulted in heads, the evidence favors the
two-headed coin. In accordance with the discussion at the beginning of this
chapter [on how to quantitatively describe the uncertainty in the property
of some system], we decide to measure information as a reduction in
uncertainty. To be specific ...
and then he goes on to describe how information can be defined as the
difference between the uncertainty in our knowledge of the coin _before_
the tosses and the uncertainty _after_ the coin is tossed twice, and then
launches into a simple calculation of the probabilities of the various
outcomes of the coin choices and tosses, etc.
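for concreteness, here is a sketch of that calculation in Python (the
variable names and layout are mine, not Ash's):

```python
from math import log2

# X = which coin was picked: 0 = unbiased, 1 = two-headed, each with prob 1/2.
# Y = number of heads observed in two tosses.

# P(Y = k | X): the fair coin gives Binomial(2, 1/2); the two-headed coin
# always gives 2 heads.
p_y_given_x = {0: {0: 0.25, 1: 0.5, 2: 0.25},
               1: {0: 0.0, 1: 0.0, 2: 1.0}}
p_x = {0: 0.5, 1: 0.5}

def H(dist):
    """Shannon uncertainty (in bits) of a probability distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# marginal P(Y = k)
p_y = {k: sum(p_x[x] * p_y_given_x[x][k] for x in p_x) for k in (0, 1, 2)}

# conditional uncertainty H(X | Y) = sum_k P(Y = k) * H(X | Y = k)
h_x_given_y = 0.0
for k in (0, 1, 2):
    posterior = {x: p_x[x] * p_y_given_x[x][k] / p_y[k] for x in p_x}
    h_x_given_y += p_y[k] * H(posterior)

# information conveyed about the coin = uncertainty before - uncertainty after
info = H(p_x) - h_x_given_y
print(round(info, 4))  # -> 0.5488 (bits)
```

note that seeing fewer than 2 heads pins down the coin completely (the
posterior uncertainty is zero); all of the residual uncertainty comes from
the two-heads outcome, which both coins can produce.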
Interestingly, Ash uses the term "entropy" only once in the whole book (at
least according to the index and my memory of his discussions), almost as an
aside:
The quantity H(X) [the quantitative uncertainty function], which we have
referred to as the "uncertainty of X" [how unsure we are about which coin we
picked randomly, i.e., is X = 0 or is X = 1?], has also been called the
"entropy" or "communication entropy" of X.
whereas Shannon's original paper on info theory used the term quite freely.
Entropy is a natural term to use in the following respect: the H function is
similar in form to the entropy functions defined in statistical physics.
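to see the "similar in form" point concretely, here is a small sketch: the
Gibbs entropy of statistical physics differs from Shannon's H only by a
constant factor (Boltzmann's constant, plus the change of logarithm base);
the distribution used here is just illustrative:

```python
from math import log2, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_H(ps):
    """Shannon uncertainty in bits: -sum p * log2(p)."""
    return -sum(p * log2(p) for p in ps if p > 0)

def gibbs_S(ps):
    """Gibbs entropy in J/K: -k_B * sum p * ln(p)."""
    return -k_B * sum(p * log(p) for p in ps if p > 0)

ps = [0.5, 0.25, 0.25]  # any probability distribution will do

# same functional form: S = k_B * ln(2) * H
assert abs(gibbs_S(ps) - k_B * log(2) * shannon_H(ps)) < 1e-30
```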
One of the motivations for developing the theory of information was to
understand the efficacy of basic communication schemes along noisy
channels, where a signal can be corrupted by some external process about
which we have little information (noise) -- did s/he say
"GCAATTGCATTGAAAAGA"? or did s/he say "GCAATTGCAGTGAAAAGA"?
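as a toy illustration: the two strings above differ in exactly one symbol,
which is the kind of corruption a noisy channel introduces (which one was
actually "sent" is my assumption here, purely for the sketch):

```python
def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

sent     = "GCAATTGCATTGAAAAGA"  # assumed transmitted string
received = "GCAATTGCAGTGAAAAGA"  # assumed corrupted string

print(hamming(sent, received))  # -> 1 (a single T -> G substitution)
```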