Re: design: purposeful or random?

Stephen Jones (sejones@ibm.net)
Sun, 09 Feb 97 22:33:54 +0800

Group

On Wed, 15 Jan 1997 15:49:24 -0500, Brian D. Harper wrote:

[...]

>SJ>Sorry, but one thing that "random mutation" cannot do is to "create
>new information":
>
>BH>How would one define "information" in such a way that a random
>process would not result in an increase in information? The only
>objective definitions of information that I know of are those found
>in information theory. These information measures are maximal
>for random processes.

>SJ>I am not by any stretch of the imagination an expert in
>"information theory", so I am unable to "define `information'" in
>such terms. I rely here on the expertise of others:
>
>BH>Steve, is it possible to respond a little more quickly? It's been
>almost a month. It took me awhile to recapture my train of thought
>on this business.
>
>SJ>Sorry to Brian and others, but I had already warned of this in
>advance. I have a new job and have less time to respond, so my
>replies will be in a batch, several weeks behind. If people think
>this is too late, then they should feel free to ignore my posts.
>This will then be self-correcting - fewer replies and I will catch
>up! :-) But if it looks like I am getting further and further
>behind, I will unsubscribe for a while until I catch up.

BH>I don't mean this to be negative; my suggestion is that you just
>let stuff more than a week or two old slide and get caught up.
>You'll be much more effective this way.

OK. I might do that.

>BH>There were a couple of reasons for my challenge above. One was to
>see if you had any understanding of the quotes you were giving
>out. The other was a genuine curiosity about the answer to the
>question I posed. As you are no doubt already aware, I'm not
>particularly a fan of neo-Darwinism and if there is an information
>theoretic argument against it then I'm certainly interested in
>knowing about it. But hand waving and word games such as those
>provided by W-S won't do.

>SJ>I was responding to Brian's specific request that I define
>"information" in terms of "information theory":

BH>You misunderstood my request. You are free to define
>information any way you wish [except, of course, something
>like "that quantity which does not increase due to a random
>mutation" ]. I merely mentioned that the only objective definitions
>I know about come from information theory (classical or algorithmic).

[...]

SJ>My point was not that I cannot define "information" but that I
>cannot define it "in `information theory'...terms". I understand
>what "information" is as described by scientific writers in books
>addressed to laymen like myself, ie. as "specified complexity":

BH>One problem is that "information" can mean all sorts of different
>things in books written for laymen. It's very confusing sometimes
>figuring out just what is meant by a particular author. But I
>think "specified complexity" corresponds fairly well to the
>meaning of "information" in algorithmic information theory.

> http://www.research.ibm.com/people/c/chaitin
> http://www.research.ibm.com/people/c/chaitin/inv.html

Thanks to Brian for the above. I will look them up some time. But I
think I will stick to laymen's definitions like "specified
complexity".

>SJ>"Information in this context means the precise determination, or
>specification, of a sequence of letters. We said above that a
>message represents `specified complexity.' We are now able to
>understand what specified means. The more highly specified a thing
>is, the fewer choices there are about fulfilling each instruction.
>In a random situation, options are unlimited and each option is
>equally probable." (Bradley W.L. & Thaxton C.B., in Moreland J.P.
>ed., "The Creation Hypothesis", 1994, p207)

BH>Oops.

What is the "Oops" about?

SJ>In generating a list of random letters, for instance, there are
>no constraints on the choice of letters at each step. The letters
>are unspecified. On the other hand, an ordered structure like our
>book full of `I love you' is highly specified but redundant and not
>complex, though each letter is specified. It has a low information
>content, as noted before, because the instructions needed to specify
>it are few. Ordered structures and random structures are similar in
>that both have a low information content. They differ in that
>ordered structures are highly specified and random structures are
>unspecified. A complex structure like a poem is likewise highly
>specified. It differs from an ordered structure, however, in that
>it not only is highly specified but also has a high information
>content. Writing a poem requires new instructions to specify each
>letter. To sum up, information theory has given us tools to
>distinguish between the two kinds of order we distinguished at the
>beginning. Lack of order - randomness - is neither specified nor
>high in information. The first kind of order is the kind found in a
>snowflake. Using the terms of information theory, a snowflake is
>specified but has a low information content. Its order arises from
>a single structure repeated over and over. It is like the book
>filled with `I love you.' The second kind of order, the kind found
>in the faces on Mount Rushmore, is both specified and high in
>information. Molecules characterized by specified complexity make
>up living things. These molecules are, most notably, DNA and
>protein. By contrast, nonliving natural things fall into one of two
>categories. They are either unspecified and random (lumps of
>granite and mixtures of random nucleotides) or specified but simple
>(snowflakes and crystals). A crystal fails to qualify as living
>because it lacks complexity. A chain of random nucleotides fails to
>qualify because it lacks specificity. No nonliving things
>(except DNA and protein in living things, human artifacts and
>written language) have specified complexity." (Bradley. & Thaxton,
>1994, pp207-208)

BH>I can agree with quite a bit of what is written above. What B&T
>are calling "specified complexity" I would call "organized
>complexity". I would disagree with trying to relate this directly
>to any objective measure of information content. I also don't
>particularly like their definition of random situation.

I am not sure that what Brian would call "organized complexity"
Bradley & Thaxton would call "specified complexity". Perhaps Brian
can clarify this?

BH>Briefly, the algorithmic info content (or Kolmogorov complexity)
>can be thought of roughly in terms of "descriptive length". The
>longer the description of an object, the greater its complexity.
>Of course, one is talking here of the length of the shortest
>description, so that B&T's "I love you" book above could be described
>as: "I love you" repeat 4000 times. The descriptive length is small, so
>the complexity of this message is small.

Agreed.
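
For what it is worth, Brian's "descriptive length" can be tried out
with an ordinary compression program. Here is a rough sketch in
Python using the standard zlib library (a compressed size is only a
crude stand-in for Kolmogorov complexity, which cannot be computed
exactly):

  import zlib

  # B&T's book filled with "I love you": long but highly ordered,
  # so it compresses down to almost nothing.
  book = b"I love you " * 4000

  print(len(book))                    # 44000 characters
  print(len(zlib.compress(book, 9)))  # a tiny fraction of that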

BH>The reason I thought at first that "specified complexity"
>corresponded roughly to Kolmogorov complexity is that I was thinking
>in terms of how long it takes to specify an object.
>
>Now, let me illustrate why the descriptive complexity (algorithmic
>information content) is generally expected to increase due to a
>random mutation. First we consider the following message written
>in our alphabet with 26 letters:
>
>AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA ...............
>
>Now we introduce a random mutation anywhere, say:
>
>AAAAAAAAAAAAAXAAAAAAAAAAAAAAAAAAAAAAAAAA................
>
>The first sequence has a small descriptive length:
>
>AA repeat
>
>the second has a much longer descriptive length:
>
>AAAAAAAAAAAAAXAA repeat A
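
For what it is worth, this is easy to check with a compression
program (a sketch in Python; zlib's output size stands in for
descriptive length):

  import zlib
  import random

  plain = b"A" * 10000
  mutant = bytearray(plain)
  mutant[random.randrange(len(mutant))] = ord("X")  # one point mutation

  # The all-A string compresses to a handful of bytes; the mutated
  # string needs a slightly longer description, so its compressed
  # size is typically a few bytes larger.
  print(len(zlib.compress(plain, 9)))
  print(len(zlib.compress(bytes(mutant), 9)))

So far as it goes, Brian's arithmetic checks out.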

This is *not* an example of increasing the information of an already
specified complex string. The string of AAAAs has zero information
content, so anything would be an improvement! But this has *no*
analogy with a living system. There may indeed have been an increase
in "algorithmic information content" by a "random mutation" but I
cannot see that it has "created" any "new information", in the sense
that I am using it, ie. on the analogy of an English sentence like
"John Loves Mary". Thaxton, Bradley & Olsen illustrate:

"Three sets of letter arrangements show nicely the difference between
order and complexity in relation to information:

1. An ordered (periodic) and therefore specified arrangement:

THE END THE END THE END THE END*

Example: Nylon, or a crystal.

2. A complex (aperiodic) unspecified arrangement:

AGDCBFE GBCAFED ACEDFBG

Example: Random polymers (polypeptides).

3. A complex (aperiodic) specified arrangement:

THIS SEQUENCE OF LETTERS CONTAINS A MESSAGE

Example: DNA, protein.

Yockey and Wickens develop the same distinction, explaining
that "order" is a statistical concept referring to regularity such as
might characterize a series of digits in a number, or the ions of an
inorganic crystal. On the other hand, "organization" refers to physical
systems and the specific set of spatio-temporal and functional
relationships among their parts. Yockey and Wickens note that
informational macromolecules have a low degree of order but a high
degree of specified complexity. In short, the redundant order of
crystals cannot give rise to specified complexity of the kind or
magnitude found in biological organization; attempts to relate the two
have little future." (Thaxton C.B., Bradley W.L. & Olsen R.L., "The
Mystery of Life's Origin" 1992, p130)

Maybe Brian can do the above with a *real* English sentence, like
THIS SEQUENCE OF LETTERS CONTAINS A MESSAGE?
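
In the meantime, anyone with a computer can see what a single random
change does to such a sentence. A sketch in Python (zlib again stands
in for descriptive length; note that it measures compressibility
only, and says nothing about whether the sentence still *means*
anything, which is precisely the point at issue):

  import zlib

  message = b"THIS SEQUENCE OF LETTERS CONTAINS A MESSAGE"
  garbled = b"THIS SEQUENCE OF LETTERX CONTAINS A MESSAGE"  # one "mutation"

  # The compressed sizes are almost identical: descriptive length
  # barely registers the change, even though the English meaning
  # has been damaged.
  print(len(zlib.compress(message, 9)))
  print(len(zlib.compress(garbled, 9)))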

[...]

SJ>But if Glenn or Brian has an example in the scientific literature
>of a "random mutation" that has "created new information", they
>could post a reference to it.

BH>You have an example above. You can find another example in the
>pure chance thread.

The above is not an "example" at all. And I find it strange that I am
referred to a web site. I do not regard web sites as "the scientific
literature".

>BH>As soon as you define what you mean by information in some
>objective, measurable way ...

>SJ>Brian seems to have forgotten, but it was *Glenn* who was the
>person claiming that "random mutation" could "create new
>information":

BH>and you replied

SJ>'Sorry, but one thing that "random mutation" cannot do is to
>"create new information" ' --SJ

OK. I stand corrected. But Glenn has not backed up his claim either.
My post is a challenge to him to do so.

BH>You then tried to support this by giving some of WS's ideas
>which you now refuse to defend.

No. I am refusing to defend *Brian's* stipulation that "information"
be defined *on his terms*, "in some objective, measurable way", ie.
using information theory, with which I am unfamiliar and which, from
what I have seen, I doubt is relevant.

I await Brian or Glenn giving me an example in the real world, or by
analogy with an English sentence. At the Southern Methodist
University Symposium, attended by leading Darwinists like Ruse, A.
Shapiro and F. Grinnell, Johnson opened with:

"Grasse argued that, due to their uncompromising commitment to
materialism, the Darwinists who dominate evolutionary biology have
failed to define properly the problem they were trying to solve. The
real problem of evolution is to account for the origin of new genetic
information, and it is not solved by providing illustrations of the
>acknowledged capacity of an existing genotype to vary within limits.
>Darwinists had imposed upon evolutionary theory the dogmatic
proposition that variation and innovative evolution are the same
process and then had employed a systematic bias in the interpretation
of evidence to support the dogma." (Johnson P.E., "Darwinism's Rules
of Reasoning", in Buell J. & Hearn V., eds., "Darwinism: Science or
Philosophy?", 1994, p6)

None of the high-powered Darwinists present answered the above, so I
assume it was valid.

SJ>All I did was deny that "random mutation" can "create new
>information".

BH>And all I did was ask you to provide some justification for
>your denial. BTW, you did more than just deny this, you
>also quoted WS thinking that that supported your denial.

No. Brian asked me to "provide some justification" *in terms of
information theory*. I cannot do this, and in any event I made no
claim about information theory. My original request was in terms of
biology:

--------------------------------------------------------
On Sun, 06 Oct 1996 14:44:29, Glenn Morton wrote:

GM>Which is a better design technique, rational design or random
>evolution? Creationists often cite the supposed inability of random
>mutation to create new information and its inability to perform
>better than a human designer.

Sorry, but one thing that "random mutation" cannot do is to "create
new information":

"Darwinian transformism demands spontaneously increasing genetic
information. The information on the chromosomes of the primitive
cell must become greater for the primeval cell to become a human one.
Just as mere molecular movements are incapable of producing
information de novo (they can modify already existing information),
neither can they produce new information, as will be shown in the
text later. NeoDarwinian theory does not enlighten us as to how a
primeval cell can energetically finance the production of new
information, so that it becomes a higher plant or a higher animal
cell. Transformism demands a very large increase in information, the
principle behind which Neo-Darwinian thought is incapable of
explaining." (Wilder-Smith, A.E., "The Natural Sciences Know Nothing
of Evolution", T.W.F.T. Publishers: Costa Mesa CA, 1981, p.vi)
--------------------------------------------------------

The example that Brian posted from "information theory" was not
relevant to the above.

[...]

>SJ>Since it is impossible to prove a universal negative, I cannot
>prove my denial is true. But Glenn can prove me wrong by citing
>examples* where "random mutation" can indeed "create new
>information".

BH>Been there, done that.

No. All Brian has done is show that:

AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA

can be changed by "random mutation" to:

AAAAAAAAAAAAAXAAAAAAAAAAAAAAAAAAAAAAAAAA

and then claim that because "The first sequence has a small
descriptive length" and "the second has a much longer descriptive
length", there has been an increase in information! In the
nature of the case, almost any change to a string of the same letters
would produce greater complexity, and therefore make it harder to
describe. And if making it harder to describe is defined as
information, then voila! there has been an increase in information!

But if information is defined as *specified complexity* then there
has been no change in information, because in both cases there isn't
any! Brian's example is therefore totally irrelevant to living
systems.

>SJ>*It is theoretically possible for a monkey at a typewriter to
>randomly type "John loves Mary.", thus randomly creating new
>information. But firstly, the probability of this is astronomically small
>... And second, the monkey would not be aware that this *was* new
>information, so even if he typed it he wouldn't conserve it.
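
For the record, the "astronomical" odds are easy to put a number on
(a sketch, assuming a hypothetical typewriter with 30 keys; "John
loves Mary." is 16 keystrokes):

  # Chance of a monkey typing one specific 16-keystroke string on a
  # 30-key typewriter, each key equally likely at each stroke.
  keys, length = 30, 16
  print((1 / keys) ** length)  # roughly 2e-24, ie. about 1 in 10^24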

SJ>I take Brian (and Glenn's) failure to post "an example in the
>scientific literature of a `random mutation' that has `created new
>information', as evidence that there is no such example.

BH>Steve, do you realize how illogical this statement is? First, neither
>Glenn or I are experts in this field. More importantly, there
>is no causal connection between what we post here and what
>is contained in the literature. egad man, think about what you're
>saying.

Brian claims he is not an "expert in this field", yet he has
corresponded on talk.origins with Yockey and other "experts in this
field". He could get an example from the "experts in this field"
within a day, if there were such an example. If his "example" of a
string of 40 "A"s mutating to 13 "A"s + 1 "X" + 26 "A"s is the best
that he can do, I take it to be the best that the "experts in this
field" can do.

BH>In spite of this, an example has been given.

See above. If that's Brian's best "example", I rest my case!

>BH>If by "information" you mean the measure in classical information
>theory then your request is easily answered since every random
>mutation to a system will increase its information content. For
>example, a monkey typing randomly at a keyboard creates more
>information than in a Shakespearean play containing the same number
>of keystrokes.

SJ>No, I have defined it as "specified complexity",

BH>Err, Steve, you need to learn the difference between past and
>present. You introduced "specified complexity" in this post, not
>the one I was responding to above. That's why I asked you to define
>"information".

See above. I have defined "information". My original post was in
response to something Glenn wrote. Then Brian chipped in, trying to
get me to define "information" in terms of information theory, but I
told him that I defined it as "specified complexity".

>SJ>the analogy of a grammatically correct English sentence, eg.
>"John loves Mary". Brian's contrast between "a monkey typing
>randomly at a keyboard" and "a Shakespearean play containing the
>same number of keystrokes" indicates that he and I share the same
>definition of what "information" means.

BH>Not quite. I'm afraid you misunderstood my example of "a monkey
>typing randomly at a keyboard". I was referring to a typical sequence
>produced by a monkey rather than a highly improbable one, i.e.

>podgoihafauiodhfennonvmsphodjgoiheiuqkrqpeoijnvbmvxvzs.sdkfo
>aehyqtuqbenmsppgnlpfjkhgosihuiafnauifhquirqemgopkgpiojdiuaus
>ygwqhwoiqandkjvbayusgquwqejo,,vlsdjkhfyuegwerjyioertksrioueryt
>wriotiudnjbcnbhjduihgertopwemlkdnmfxnvolp;kpqaiuqsnklznxiknf
>asiodhasuesfnblm;z;x,djsmgklehuiankflnkjnszkljdksjdnkazjksdbg
>uiaoiehyuiqwehioerjkhsiofjsdhioasjoqwhpqkeppoejiohxcmoxcbnx
>cbohsdognsouihxuiojiowhrpkfolxncohsiodjauiodioszhuiasouiqh
>wuiasuifsfbiauiujkauiqwoiqohaisklzml;vmnpckpcvbjoidfguiwehriqa
>nfkjabcjkabsdiuqwgbqjnsdnksjdgisdnoshfinjweopqhjrejgodhnashu
>iqweowheinaosdnghoeoqihiabsfisudbvsdnfopiqwhwrojusdhfgpgjml
>vlzsmvl;sjgosjefuiehokwngijndvuionovihifnkdnbvjklsdbiqawnioqab
>ifqabifndohasnfoqaehgsffhgposajuioqgwyubajkbdfyuwfyuqweijrfoph
>mjklsnajkbjklawyuqiotjroijkdvkldnfiouqwjproqjeoglnsdskldnoiehto
>qengasdngolasdngoianioasnioaenfiouqwehuiweioejtrioueirofpognp
>omg.,mznjvuitruqwhsnbdjkfzbkjvauifbaibfuiaygqwiejtpoerjkpodfgm
>mbosdngwsenuifgeuifyuiytfueosuidhiofbidbabfiuasdiuwehqtwjros
>dlvnsoidjgiuwrjqpwoejgas -- the MonkMeister
>
>This sequence has a higher information content than a passage
>from Shakespeare of the same length.
>
>Oooh, I know you're going to like this ;-).

I do "like it". f that is the case, then it shows that the
"information" in "information theory" and the "information" that I
am talking about are entirely different things.
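
The difference can even be put into numbers. In *classical* (Shannon)
information theory, uniformly random letters carry more bits per
character than English, whose letter frequencies are very uneven. A
sketch, using the standard single-letter entropy formula and ignoring
any context between letters:

  import math
  from collections import Counter

  def entropy_per_char(text):
      """Shannon entropy H = -sum(p * log2(p)) over letter frequencies."""
      counts = Counter(text)
      n = len(text)
      return -sum((c / n) * math.log2(c / n) for c in counts.values())

  english = "to be or not to be that is the question"
  gibberish = "podgoihafauiodhfennonvmsphodjgoiheiuqkrqpeo"

  print(entropy_per_char(english))    # lower: letters very uneven
  print(entropy_per_char(gibberish))  # higher: closer to uniform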

BH>Seriously, though, if you want to understand better what is being
>measured with Kolmogorov complexity and why it's a good
>measure of information content, I just thought of another
>good analogy to it that will let us easily calculate some rough
>numbers for information content, namely compression algorithms
>like pkzip or gzip.

Yes. I remember Brian posting a quiz about it a year or two ago. I
found that very helpful. But if "Kolmogorov complexity" claims that
a monkey at a typewriter typing gibberish "has a higher information
content than a passage from Shakespeare of the same length" then it
shows that what information theory means by "information" and what I
mean by it are two very, very different things.
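
Still, Brian's compression analogy is easy to try for oneself. A
sketch in Python, pitting a well-known line of Shakespeare against a
stretch of the MonkMeister's typing above (gzip-style compression via
zlib gives only an upper bound on Kolmogorov complexity):

  import zlib

  shakespeare = (b"Tomorrow, and tomorrow, and tomorrow, creeps in this "
                 b"petty pace from day to day, to the last syllable of "
                 b"recorded time")
  monkey = (b"podgoihafauiodhfennonvmsphodjgoiheiuqkrqpeoijnvbmvxvzs.s"
            b"dkfoaehyqtuqbenmsppgnlpfjkhgosihuiafnauifhquirqemgopkgpi")

  # English has structure the compressor can exploit; the monkey's
  # typing has almost none, so it should compress worse per character.
  print(len(zlib.compress(shakespeare, 9)) / len(shakespeare))
  print(len(zlib.compress(monkey, 9)) / len(monkey))

But either way, both measures are blind to whether the text is
*specified*, which is my whole point.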

[...]

>SJ>I note Brian picks on the minor point in Wilder-Smith's argument,
>and parodies his terminology (not his content). I wonder if he has
>ever read W-S's full argument in his books? :-) W-S's major point
>is "Transformism demands a very large increase in information, the
>principle behind which Neo-Darwinian thought is incapable of
>explaining."

>BH>Of course, it's really tough explaining a principle which is never
>defined. Perhaps you could summarize this principle for us?
>Please don't ask me to dig it out of W-S myself.

Since we cannot even agree what "information" is, I think it would be
a complete waste of time! :-)

>SJ>I take this as Brian's way of admitting he has not read
>Wilder-Smith? I suggest he does read him, rather than rely on my
>meagre powers of summarisation. IMHO W-S was a genius. He refutes
>Kenyon's Biochemical Predestination, and I would not be at all
>surprised if Kenyon's conversion to creationism was due to reading
>W-S...Can there be anything as amazing as this? A leading
>evolutionist Professor of Biology reads a refutation of his life's
>work in a creationist book and becomes a creationist? I would urge
>Brian and others to read Wilder-Smith's "The Creation of Life" for
>themselves.

>BH>The principle should put some meat on the above statements.
>Given a particular transformation how do I go about calculating the
>very large increase in information? You claim I'm picking a minor
>point, however it seems to me that the energetic financing of
>information production plays a key role in this "principle". Thus
>you should be able to calculate this cost in energy. I had hoped
>to show how ludicrous such a principle is by observing what the
>units of such a measure would be [J/bit]. If you think about it a
>bit I think you'll see that such an idea is reductionistic to the
>point of absurdity. Would the information in DNA be reducible to
>the energy contained in chemical bonds? Does this information have
>anything at all to do with chemical energy? Would the information
>in Mike Behe's book be reducible to the energy of adhesion between
>ink and paper?

Brian should read Wilder-Smith for himself. I doubt if I could do
justice to the information theory parts of his argument. I would need
to read Wilder-Smith again and I haven't the time to do it at
present. Maybe I'll do it later.

[...]

>SJ>I could spend a lot of time answering this, but somehow, on
>Brian's track record with my posts, I don't think he really wants to
>know! :-)

BH>Given my detailed response to you I find this statement truly
>amazing. I generally spend a considerable amount of time on my
>posts to the reflector, including my responses to you. The reason
>it takes time is that I'm giving my own ideas rather than just
>dumping a bag full of quotes. Let me suggest to you that if you
>have no intention of defending the ideas in this quotation ad
>absurdum campaign then don't present them.

I find Brian's "detailed responses" to be mostly a series of traps:
he redefines what I say and then shoots it down.

BH>BTW, I'm getting sick of your policy of insulting someone however
>you wish and then following it with a :-). I suspect I know why you
>do it, so you can say whatever you want without taking any
>responsibility for it.

No. I take *full* "responsibility" for what I say and I never
consciously "insult" anyone. If Brian can point out where I insulted
him personally I will unreservedly apologise. As for my :-), that is
standard internet practice for saying hard things with a smile and
I will continue to do it.

BH>another BTW, I suspect what you really meant to say was given
>my track record of shooting you down, you dare not actually say
>anything.

Brian flatters himself. I am not intimidated in the slightest by his
*attempts* at "shooting" me "down". I thank him for it, since it
strengthens my understanding of the strengths and weaknesses of
evolution.

>SJ>Creationists on this Reflector are always being urged to read
>the literature for themselves. If Brian really is interested in
>what Wilder-Smith means by "energetically finance the production of
>new information" he will read the latter's books for himself.

BH>Translation: Steve doesn't know what WS means either.

On that particular point, I freely admit I don't - it was not my main
point. And I haven't got the time to re-read Wilder-Smith to find it out
just for Brian, either. Since we don't even agree on what
"information" is, it would probably be a huge waste of time.

God bless.

Steve

-------------------------------------------------------------------
| Stephen E (Steve) Jones ,--_|\ sejones@ibm.net |
| 3 Hawker Avenue / Oz \ Steve.Jones@health.wa.gov.au |
| Warwick 6024 ->*_,--\_/ Phone +61 9 448 7439 (These are |
| Perth, West Australia v my opinions, not my employer's) |
-------------------------------------------------------------------