Re: design: purposeful or random?

Brian D. Harper (harper.10@osu.edu)
Sun, 01 Dec 1996 20:32:15 -0500

At 07:13 AM 11/26/96 +0800, Steve wrote:
>Group

;-)

>
>On Thu, 31 Oct 1996 15:23:55 -0500, Brian D. Harper wrote:
>
>[...]
>
>SJ>Sorry, but one thing that "random mutation" cannot do is to "create
>>new information":
>
>BH>How would one define "information" in such a way that a random
>>process would not result in an increase in information? The only
>>objective definitions of information that I know of are those found
>>in information theory. These information measures are maximal
>>for random processes.
>
>I am not by any stretch of the imagination an expert in "information theory",
>so I am unable to "define `information'" in such terms. I rely here on the
>expertise of others:
>

Steve, is it possible to respond a little more quickly? It's been almost
a month. It took me a while to recapture my train of thought on this
business.

There were a couple of reasons for my challenge above. One was to
see if you had any understanding of the quotes you were giving
out. The other was a genuine curiosity about the answer to the
question I posed. As you are no doubt already aware, I'm not
particularly a fan of neo-Darwinism and if there is an information
theoretic argument against it then I'm certainly interested in
knowing about it. But hand waving and word games such as those
provided by W-S won't do.

I've done a little thinking on this since my initial question to
you, and I've come up with a suggestion that might be worth
pursuing. Note that this is just an intuitive guess on my
part (a non-expert), but it seems to me that some measure
of mutual information content would probably satisfy the condition
of not increasing due to a random mutation (the classical Shannon
information would increase). I also suspect that mutual
information may capture the basic ideas of irreducible
complexity, with the added bonus of being objectively measurable.

I kind of like this idea; I wonder if someone has already thought
of it. Probably :). Let's suppose that some clever person comes
up with a measure of mutual information content and can show
that this information cannot increase due to a random mutation.
This would be only a small step towards posing any threat to
neo-Darwinism. One would still have to show that this information
measure doesn't increase due to the combination of random mutation
plus a selection mechanism. Good luck. It seems to me that one is
searching for some general principle that might be called the
"conservation of information". Stated this way it almost makes
me giggle out loud. OK, I confess, I did giggle a little as I wrote
that :). But who knows, maybe there is such a principle at least
in some narrower sense, i.e. "information is always conserved
for certain types of physical processes X, Y and Z".
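To make the suggestion concrete, here is a toy numerical sketch of my
own (nothing from the literature, and the sequences and mutation rate
are made up): start with a sequence perfectly correlated with a
reference, randomly mutate a fraction of its sites, and watch the
empirical mutual information with the reference go down, not up:

```python
import math
import random
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X;Y) in bits, treating each
    aligned position in the two sequences as one (x, y) sample."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

random.seed(1)
alphabet = "acgt"
reference = [random.choice(alphabet) for _ in range(5000)]
mutant = list(reference)          # starts as a perfect copy

before = mutual_information(reference, mutant)

# Randomly mutate 20% of the sites; each hit replaces the symbol
# with a uniform random one, wiping out correlation at that site.
for i in random.sample(range(len(mutant)), 1000):
    mutant[i] = random.choice(alphabet)

after = mutual_information(reference, mutant)
print(f"I(ref; copy)   = {before:.3f} bits")
print(f"I(ref; mutant) = {after:.3f} bits")
```

The point of the sketch: plain Shannon entropy of the mutant alone
stays maximal under random mutation, but its mutual information with
something else (a reference, an environment) can only be degraded by it.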

OK, I've helped you out enough already ;-). When you discover
the CoI Principle and become famous, don't forget where you
got the idea ;-).

SJ:==
>
>But if Glenn or Brian has an example in the scientific literature of a
>"random mutation" that has "created new information", they could
>post a reference to it.
>

As soon as you define what you mean by information in some
objective, measurable way ... BTW, the following definition
won't do:

information: that quantity which doesn't increase due
to random mutations.

If by "information" you mean the measure in classical information
theory then your request is easily answered since every random
mutation to a system will increase its information content.
For example, a monkey typing randomly at a keyboard creates more
information than a Shakespearean play containing the same number of
keystrokes.
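If that sounds backwards, here is a quick sketch (first-order symbol
frequencies only; the "play" is just a repeated English phrase I made
up, not actual Shakespeare) comparing uniform random typing with
structured text of the same length:

```python
import math
import random
import string
from collections import Counter

def shannon_entropy(text):
    """Empirical Shannon entropy of a string, in bits per symbol."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
keys = string.ascii_lowercase + " "   # 27 equally likely keys

# The "monkey": 10,000 uniformly random keystrokes.
monkey = "".join(random.choice(keys) for _ in range(10_000))

# Structured English reuses a few symbols heavily, so its empirical
# entropy per symbol is well below that of uniform random typing.
play = ("to be or not to be that is the question " * 250)[:10_000]

print(f"monkey: {shannon_entropy(monkey):.2f} bits/symbol")
print(f"play  : {shannon_entropy(play):.2f} bits/symbol")
```

The monkey's output comes in near the maximum of log2(27) = 4.75
bits per symbol; the structured text comes in well under that.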

[...]

>BH>"Darwinian transformism demands spontaneously increasing genetic
>>information. The information on the chromosomes of the primitive
>>cell must become greater for the primeval cell to become a human one.
>>Just as mere molecular movements are incapable of producing
>>information de novo (they can modify already existing information),
>>neither can they produce new information, as will be shown in the
>>text later. NeoDarwinian theory does not enlighten us as to how a
>>primeval cell can energetically finance the production of new
>>information, so that it becomes a higher plant or a higher animal
>>cell. Transformism demands a very large increase in information, the
>>principle behind which Neo-Darwinian thought is incapable of
>>explaining." (Wilder-Smith, A.E., "The Natural Sciences Know Nothing
>>of Evolution", T.W.F.T. Publishers: Costa Mesa CA, 1981, p.vi)
>
>BH>egad man, I wish you wouldn't give quotes like this. They give me
>>brain cramps. How much does it cost to "energetically finance
>>the production of new information"? Would it be, say, 10 Joules
>>per bit or what?
>
SJ:==
>I note Brian picks on the minor point in Wilder-Smith's argument, and
>parodies his terminology (not his content). I wonder if he has ever read
>W-S's full argument in his books? :-) W-S's major point is "Transformism
>demands a very large increase in information, the principle behind which
>Neo-Darwinian thought is incapable of explaining."
>

Of course, it's really tough explaining a principle which is never
defined. Perhaps you could summarize this principle for us?
Please don't ask me to dig it out of W-S myself. The principle
should put some meat on the above statements. Given a particular
transformation, how do I go about calculating the very large increase
in information? You claim I'm picking on a minor point; however,
it seems to me that the energetic financing of information
production plays a key role in this "principle". Thus you should
be able to calculate this cost in energy. I had hoped to show
how ludicrous such a principle is by observing what the units
of such a measure would be [J/bit]. If you think about it a bit
I think you'll see that such an idea is reductionistic to the point
of absurdity. Would the information in DNA be reducible to the
energy contained in chemical bonds? Does this information have
anything at all to do with chemical energy? Would the information in
Mike Behe's book be reducible to the energy of adhesion between
ink and paper?
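Incidentally, if one insists on units of J/bit, the only physically
well-defined figure I can think of is Landauer's limit for *erasing*
a bit, and it cuts the other way: it is a lower bound on dissipation
for erasure, vanishingly small, and it says nothing about how new
information gets "energetically financed". A two-line calculation:

```python
import math

# Landauer's principle (1961): erasing one bit of information must
# dissipate at least k_B * T * ln(2) joules of heat. Note that it
# prices *erasure*, not the "production" of new information.
K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # roughly room temperature, K

landauer_limit = K_B * T * math.log(2)   # joules per bit erased
print(f"Landauer limit at {T:.0f} K: {landauer_limit:.3e} J/bit")
```

At room temperature this works out to roughly 3 x 10^-21 J/bit, some
twenty orders of magnitude below anything like "10 Joules per bit".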

to be continued later as (if) time permits ........

Brian Harper | "If you don't understand
Associate Professor | something and want to
Applied Mechanics | sound profound, use the
The Ohio State University | word 'entropy'"
| -- Morowitz
Bastion for the naturalistic |
rulers of science |