RE: What does ID mean?

Paul A. Nelson (pnelson2@ix.netcom.com)
Thu, 16 Apr 1998 14:42:10 -0500 (CDT)

Gary wrote:

>This has bothered me for some time. My understanding is that the Shannon
>entropy of a highly complex sequence, one of high information content, will
>approach that of a random sequence. The difference between a highly
>complex sequence and a random one is meaning. But meaning is conveyed by
>language. So it seems that if we do not know the language, we will never
>know whether the message we have received is static or significant.

In biology, the relevant language is function. Not just any sequence of
amino acids will yield an oxygen-carrying molecule (e.g., hemoglobin
or myoglobin).
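
Gary's entropy point can be made concrete with a minimal sketch (not part of the original exchange; the function name and sample text are illustrative). Per-symbol Shannon entropy, H = -sum(p_i * log2(p_i)), depends only on symbol frequencies, so a meaningful sentence and a scramble of the same letters score identically -- the "meaning" Gary mentions is invisible to the measure:

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Per-symbol Shannon entropy, computed from symbol frequencies alone."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

text = "to be or not to be that is the question"
scrambled = "".join(sorted(text))  # same letters, meaning destroyed

# Both calls return the same value, since only frequencies matter:
print(shannon_entropy(text))
print(shannon_entropy(scrambled))
```

This is why, as Gary says, entropy alone cannot separate a significant message from static: distinguishing the two requires something beyond the statistics of the sequence itself.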

>Is there any way in principle--apart from the semantics of a particular
>language--to distinguish a sequence of high information content from random
>noise? If not, as James intimates, how can we distinguish intelligent
>production from chance?

I would turn your point around. The fact that we CAN distinguish
intelligent production from chance, even where no language is present --
what, for instance, is the language of Stonehenge? -- means that
we need to be more clever in understanding the logical and epistemic
structure of the design inferences we already make.

Friends, gotta bail out here. I'll be off-line indefinitely.

Paul.