RE: ABCDEFGHIJKMNOPQRSTUVWXYZ

Brian D Harper (bharper@postbox.acs.ohio-state.edu)
Tue, 30 Dec 1997 00:17:23 -0500

Hmmm... since John has agreed to say no more and Glenn
has promised to shut up [and who says miracles can't
happen :)] it seems it may be safe for me to speculate
wildly with no fear of rebuttal [fat chance].

At 05:45 AM 12/29/97 -0600, John wrote:
>Brian,
>
>Surprise is a -very- subjective thing. Is mathematical "information"
>similarly subjective? (Cf. my note to Glenn about the "subjectivity" of
>compressibility.)
>

No, one advantage of information-theoretic information is
its objectivity. The problem lies only with my analogy in
terms of surprise, which works only as a rough analogy when
the surprise comes about through having a supposed pattern
broken.

But patterns aren't everything. For example, it might not
be obvious that there is a pattern in the following
sequence

1 1 2 3 5 8 13 21 34 55 89 144 233 377 610 987 ...

yet this is the Fibonacci series, and it has small
information content since it can be generated by a simple
recursive formula. Similarly, the digits of pi and the
Mandelbrot set are algorithmically simple.
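
To make that concrete, here's a little program (my own toy
sketch, in Python, not anything from the AIT literature)
showing how the whole series collapses to a few lines of
code:

# A few lines regenerate as many Fibonacci terms as you
# like, which is why the series has low algorithmic
# information content despite being infinitely long.
def fib(n):
    """Return the first n Fibonacci numbers: 1, 1, 2, 3, 5, ..."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

print(fib(16))  # [1, 1, 2, 3, 5, ..., 610, 987]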

<<sidelight: here's an extra credit problem for any
adaptationists in the crowd. Successive pairs of the
Fibonacci series define the phyllotaxy patterns of
leaves on a plant. In the case of spiral phyllotaxis,
successive leaves arise at angles that divide the
circle around the meristem in the proportions of the
Golden Section. How would natural selection account for
such a precise arrangement? Or is it design ;-) >>
[I really shouldn't be this naughty, but Christmas
is over and Santa has a short memory :)]
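
For the curious, here's the arithmetic behind the
sidelight (my own quick check; the 137.5-degree
"golden angle" is the figure usually quoted in the
botany literature for spiral phyllotaxis):

# Ratios of successive Fibonacci terms converge on the
# Golden Section phi, and the corresponding divergence
# angle 360*(1 - 1/phi) comes out near 137.5 degrees.
phi = (1 + 5 ** 0.5) / 2

print(987 / 610)            # 1.6180327..., already close to phi
print(phi)                  # 1.6180339...
print(360 * (1 - 1 / phi))  # 137.5077... degrees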

Before I go on I should mention that Glenn and I
have been discussing algorithmic information
theory (AIT), which is somewhat different from
Shannon info theory. In AIT, algorithmic complexity,
algorithmic information content, and Kolmogorov
complexity all refer to the same measure. Perhaps
the best way of thinking of it is in terms of
"descriptive complexity". The complexity of an
object is equal to the length of the shortest
description that completely specifies the object.
In AIT, the descriptions are algorithms. A good
source of information on AIT is Chaitin's web page:

http://www.cs.auckland.ac.nz/CDMTCS/chaitin/

which contains html versions of many of his papers.
Chaitin is one of the founders of AIT (he began the
work while still in high school!!)
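
To see "descriptive complexity" in miniature, here's a toy
example of my own (nothing this crude appears in the real
theory, where descriptions are programs for a fixed
universal machine):

# Two descriptions of the same 10,000-character object.
# The literal description costs 10,000 characters; the
# algorithmic one costs 11, so the object is big but simple.
target = "AB" * 5000

literal_description = target              # spell the object out
algorithmic_description = '"AB" * 5000'   # a short program text

print(len(literal_description))           # 10000
print(len(algorithmic_description))       # 11
assert eval(algorithmic_description) == target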

You raised some interesting questions that the
pioneers of AIT had to struggle with. One is that
the shortest algorithm depends on the computer
that you use. For example, a "fancy" computer will
have a lot of stuff "built in": predefined functions
and the like. Suppose you have a special computer with the
Encyclopedia Britannica hardwired in. Then all
you have to do is call it with a special function

GO EB

The way around this is to define everything relative
to one standard computer (a Universal Turing Machine).
Since any universal machine can simulate any other
once it's handed a fixed interpreter, the shortest
programs on two such machines differ by at most a
constant that doesn't depend on the object described.
Chaitin has been working recently on a new version
of LISP in order to iron out some of the technical
difficulties involved in practical applications
of AIT.
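
Here's a little sketch of the "fancy computer" problem
(toy machines of my own invention, not Chaitin's
formalism):

# On the fancy machine the Encyclopedia costs 5 characters
# to describe; on the plain machine it costs its full
# length plus 6.  The cure is to charge such conveniences
# to a one-time, object-independent constant (an
# interpreter), which is what fixing a Universal Turing
# Machine accomplishes.
EB = "All human knowledge ... " * 1000  # stand-in for the Britannica

def fancy(description):
    """Machine with the Encyclopedia hardwired in."""
    return EB if description == "GO EB" else description[len("PRINT "):]

def plain(description):
    """Machine with no built-ins at all."""
    return description[len("PRINT "):]

print(len("GO EB"))        # 5, shortest description on fancy
print(len("PRINT " + EB))  # len(EB) + 6 = 24006 on plain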

But AIT is very useful just for ordering various things
in terms of increasing complexity (this is more complex
than that) without worrying about the actual number
of bits. For that purpose one can get reliable results
just by consistently using the same computer or the same
compression algorithm.
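
As an illustration (my choice of compressor; any fixed one
would do for a rough ordering):

# Rank strings by compressed size under one fixed
# compressor.  The absolute byte counts mean little; the
# relative ordering is the (rough) complexity comparison.
import zlib

def compressed_size(s):
    return len(zlib.compress(s.encode(), 9))

samples = [
    "A" * 64,                       # pure repetition
    "ABCDEF" * 11,                  # simple repeating pattern
    "the quick brown fox jumps over the lazy dog and runs away home",
    "xq3v9zj2kw7pf4yg8mbd1hc6ulnr5tso0aei kqzvxjw ygpfmb dhculn rtsoaei",
]
for s in sorted(samples, key=compressed_size):
    print(compressed_size(s), repr(s[:24]))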

With some of the above in mind, let me try to answer
Glenn's objection. It turns out that one often includes
statistical properties of a particular language as part
of a compression scheme: the sort of thing we all learn
from watching Wheel of Fortune, namely that T, R, S,
and L occur more commonly than, say, Z and X. It turns
out that English messages are typically more than 50%
compressible based on these statistical considerations
alone. It was with this in mind that I mentioned
spotting ABCDEF as an orderly pattern, i.e., I was
assuming that the alphabet was part of the "hardware"
and not part of the algorithm itself. Otherwise, Glenn
is correct.
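
A rough back-of-the-envelope check on that 50% figure (the
frequency values are approximate, from standard published
tables, and the conclusion takes 8-bit characters as the
baseline):

# Shannon entropy of English from single-letter
# frequencies.  Against an 8-bit character encoding,
# ~4.2 bits/letter is already close to 50% compression;
# digram and word statistics push real English well past it.
import math

freq = {
    'e': 0.127, 't': 0.091, 'a': 0.082, 'o': 0.075, 'i': 0.070,
    'n': 0.067, 's': 0.063, 'h': 0.061, 'r': 0.060, 'd': 0.043,
    'l': 0.040, 'c': 0.028, 'u': 0.028, 'm': 0.024, 'w': 0.024,
    'f': 0.022, 'g': 0.020, 'y': 0.020, 'p': 0.019, 'b': 0.015,
    'v': 0.010, 'k': 0.008, 'j': 0.002, 'x': 0.002, 'q': 0.001,
    'z': 0.001,
}

H = -sum(p * math.log2(p) for p in freq.values())
print(f"{H:.2f} bits/letter vs log2(26) = {math.log2(26):.2f} if equally likely")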

Brian Harper
Associate Professor
Applied Mechanics
The Ohio State University

"... we have learned from much experience that all
philosophical intuitions about what nature is going
to do fail." -- Richard Feynman