Re: Defining GOG

Glenn Morton (grmorton@waymark.net)
Wed, 17 Dec 1997 15:50:09 -0600

Hi George,

At 09:01 AM 12/17/97 -0600, George Andrews wrote:

>This is precisely the problem with ID. Evolution theory may (can?) account for
>information generation via complexity notions which, therefore, fill the design
>or information gap. This information gap, as I understand, is the thrust of the
>ID argument.

And I have pointed out numerous times (without response) to individual
members of the ID gang that they are using the term "information"
incorrectly. They are using it as they would use the term 'meaning'. Meaning
is something that cannot be measured and depends upon an agreement between
two individuals. A series of characters on a piece of paper may appear
meaningless to me, but if two spies have agreed on a code, then it has
meaning to them. Information in information theory, by contrast, is quite
precisely defined. A random sequence of characters, generated by a random
number generator, has as much information as do the sentences I am writing
to you. In fact, there is absolutely NO way to tell two such sequences
apart, even if one is made by a random Markov process and one is typed by me.
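
To make that concrete, here is a quick back-of-the-envelope Python sketch
(mine, not Yockey's) that estimates first-order Shannon entropy per
character; the random stream scores at least as high as ordinary English
prose:

import math
import random
import string
from collections import Counter

def entropy_per_char(s):
    # First-order Shannon entropy estimate, in bits per character.
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

random.seed(0)
rand_text = "".join(random.choice(string.ascii_lowercase) for _ in range(10000))
english = ("meaning is something that cannot be measured and depends "
           "upon an agreement between two individuals ") * 100

print("random letters:", round(entropy_per_char(rand_text), 2), "bits/char")
print("english prose: ", round(entropy_per_char(english), 2), "bits/char")
# The random stream comes out near log2(26), about 4.70 bits/char; English
# lands lower because its letter frequencies are skewed. So the random
# sequence carries at least as much Shannon information per character.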
To back this up, here are a couple of Yockey quotes (after them I will
illustrate his 'shortest algorithm' idea with a quick compression test). In
Yockey's terminology, a random sequence is what you would expect; a highly
organized sequence is one made by a highly organized process, such as a
living organism.

"Organisms are often characterized as being 'highly ordered' and in the same
paragraph as being 'highly organized'. Clearly these terms have opposite
meaings in the context of this chapter. The first message discussed in
section 2.4.1 is highly ordered and has a low entropy.[the first message is
'0101010101010101010101' the second message a higher organized one is
'0110110011011110001000' --GRM] Being 'highly organized' means that a long
algorithm is needed to describe the sequence and therefore higly organized
systems have a large entropy. Therefore highly ordered systems and highly
organized ones occupy opposite ends of the entropy scale and must not be
confused. Since highly organized systems have a high entropy, they are found
embedded among the random sequences that occupy the high end of the entorpy
scale."
"Kolmogorov (1965, 1968) and Chaitin (1966, 1969) have called the
entropy of the shortest algorithm. needed to compute a sequence its
complexity. Chaitin (1975b) proposed a definition of complexity that that
has the formal properties of the entropy concept in information theory.
Chaitin (1970, 1979) and Yockey (1974, 1977c) pointed out the applicability of
this concept in establishing a measure of the complexity or the information
content of the genome.
...
"Thus both random sequences and highly organized sequences are complex
because a long algorithm is needed to describe each one. Information theory
shows that it is fundamentally undecidable whether a given sequence has been
generated by a stochastic process or by a highly organized process. This is
in contrast with the classical law of the excluded middle (tertium non datur),
that is, the doctrine that a statement or theorem must be either true or
false. Algorithmic information theory shows that truth or validity may also
be indeterminate or fundamentally undecidable."~Hubert Yockey, Information
Theory and Molecular Systems, (Cambridge: Cambridge University Press, 1992),
p. 81-82.
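
The 'shortest algorithm' (Kolmogorov/Chaitin complexity) Yockey describes is
uncomputable in general, but a general-purpose compressor gives a crude upper
bound on it. Here is a minimal sketch of my own, using zlib's output length
as a stand-in, applied to long versions of Yockey's two kinds of message:

import random
import zlib

def compressed_size(bits):
    # zlib output length: a rough upper bound on the length of the
    # shortest program (Kolmogorov/Chaitin complexity) for the string.
    return len(zlib.compress(bits.encode(), 9))

ordered = "01" * 5000          # Yockey's 'highly ordered' 0101... pattern
random.seed(0)
rand_bits = "".join(random.choice("01") for _ in range(10000))

print("ordered:", len(ordered), "chars ->", compressed_size(ordered), "bytes")
print("random: ", len(rand_bits), "chars ->", compressed_size(rand_bits), "bytes")
# The ordered string collapses to a few dozen bytes ('repeat 01 5000 times'),
# while the random string stays near the two-symbol maximum of about 1 bit
# per character: low complexity versus high complexity, Yockey's contrast.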

Creationists have it totally wrong when they say that life is ordered.
Crystals are ordered; life is complex and highly organized. There is a big,
big difference. It is also fundamentally impossible to determine whether a
sequence made by a highly organized process, like life, is different from a
sequence generated by a random process. THUS ONE CANNOT USE THE HIGHLY
ORGANIZED NATURE OF LIFE AS EVIDENCE OF DESIGN. THE COMPLEXITY OF LIFE MIGHT
BE DUE TO A RANDOM PROCESS.
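
To see why the undecidability claim bites, compare a bit stream from a fully
deterministic ('organized') rule with one from a pseudo-random generator.
This is my own toy illustration, not anything from Yockey's book, using the
logistic map at r = 4 thresholded to bits as the deterministic source:

import math
import random
import zlib
from collections import Counter

def bits_per_symbol(s):
    # First-order Shannon entropy estimate, in bits per symbol.
    n = len(s)
    return -sum(c / n * math.log2(c / n) for c in Counter(s).values())

# 'Organized' source: the deterministic logistic map x -> 4x(1-x),
# thresholded at 0.5 to produce bits.
x, out = 0.3333, []
for _ in range(10000):
    x = 4.0 * x * (1.0 - x)
    out.append("1" if x > 0.5 else "0")
organized = "".join(out)

# 'Stochastic' source: an ordinary pseudo-random generator.
random.seed(1)
stochastic = "".join(random.choice("01") for _ in range(10000))

for name, seq in [("organized ", organized), ("stochastic", stochastic)]:
    print(name, round(bits_per_symbol(seq), 3), "bits/symbol,",
          len(zlib.compress(seq.encode(), 9)), "bytes compressed")
# Both streams score about 1 bit/symbol and compress about equally well:
# nothing in the statistics betrays which one came from a deterministic rule.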

glenn

Adam, Apes, and Anthropology: Finding the Soul of Fossil Man

and

Foundation, Fall and Flood
http://www.isource.net/~grmorton/dmd.htm