Re: Specified Complexity was: Re: NTSE Conference papers

Stephen Jones (sejones@ibm.net)
Sat, 15 Feb 97 19:01:49 +0800

Group

On Mon, 10 Feb 1997 17:00:17 -0500, Brian D. Harper wrote:

[...]

BH>Thanks for this info Burgy. There are quite a few really
>interesting papers available. The one by Bill Dembski
>["Intelligent Design as a Theory of Information"] caught my eye as
>it involves some things that we were discussing here not long ago.
>In particular, there was a discussion about whether information
>could increase due to random mutations. This all depends on how
>information is defined.

Agreed.

BH>If defined a la Shannon or Kolmogorov then it seems pretty clear
>that an increase in information is expected due to random
>mutations.

Agreed. But this may not be what is meant by "information" in
biology, namely the assignment of specified meaning:

"Some have attempted to develop a theory of meaning from these
ideas. Shannon (1949) warned against this at the outset of his
paper. The assignment of meaning or in biology, specificity, to
certain particular member of the ensemble lies outside information
theory." (Yockey H.P., "An Application of Information Theory to the
Central Dogma and the Sequence Hypothesis", Journal of Theoretical
Biology, 46, 1974, pp371-372)
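Brian's point about Shannon-style measures can be illustrated with a
toy sketch (my own construction, purely illustrative): start with a
maximally ordered sequence, apply random point mutations, and the
per-symbol Shannon entropy of the symbol frequencies goes up.

```python
import math
import random

def shannon_entropy(seq):
    """Per-symbol Shannon entropy (bits) of the symbol frequencies in seq."""
    counts = {}
    for s in seq:
        counts[s] = counts.get(s, 0) + 1
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
alphabet = "ACGT"
seq = list("A" * 1000)           # highly ordered: entropy is 0 bits/symbol
before = shannon_entropy(seq)

# apply 300 random point mutations at random positions
for _ in range(300):
    i = random.randrange(len(seq))
    seq[i] = random.choice(alphabet)
after = shannon_entropy(seq)

print(f"before: {before:.3f} bits/symbol, after: {after:.3f} bits/symbol")
```

In this statistical sense "information" plainly increases under random
mutation, which is exactly why Yockey insists it is not the same thing
as biological specificity.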

Dembski says that "information" is not only "the transmission of
signals across a communication channel":

"The fundamental intuition underlying information is not, as is
commonly thought, the transmission of signals across a communication
channel, but rather, the ruling out of possibilities. To be sure,
when signals are transmitted across a communication channel,
invariably a set of possibilities is ruled out, namely, those signals
which were not transmitted. But to acquire information remains
fundamentally a matter of ruling out possibilities, whether these
possibilities comprise signals across a communication channel or take
some other form." (Dembski W.A., "Intelligent Design as a Theory of
Information", January 1997, Indiana, USA)
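Dembski's "ruling out of possibilities" intuition has a simple
quantitative form for equiprobable outcomes: the information gained is
log2 of the ratio of possibilities before to possibilities after. A
minimal sketch (my own worked example, not taken from Dembski's paper):

```python
import math

def info_gained(possible_before, possible_after):
    """Bits of information gained by ruling possibilities out,
    assuming all outcomes are equiprobable: log2(before/after)."""
    return math.log2(possible_before / possible_after)

# A fair die has 6 possible outcomes. Learning "the roll is even"
# rules out three of them, leaving {2, 4, 6}:
print(info_gained(6, 3))   # 1.0 bit
# Learning the exact face rules out five of the six:
print(info_gained(6, 1))   # log2(6), roughly 2.585 bits
```

The more possibilities a message rules out, the more information it
carries, whether or not any communication channel is involved.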

BH>Steve Jones wanted to define information in terms of specified
>complexity but this definition never was particularly clear to me.

It wasn't *my* definition. That is how origin of life specialist
Orgel defined it:

"Living organisms are distinguished by their specified complexity.
Crystals fail to qualify as living because they lack complexity;
mixtures of random polymers fail to qualify because they lack
specificity." (Orgel L.E., "The Origins of Life", 1973, p189, in
Thaxton C.B., et al., "The Mystery of Life's Origin", 1992, p130)

Interestingly, it appears also to be Dawkins' definition, namely a
"quality, specifiable in advance":

"This has been quite a long, drawn-out argument, and it is time to
remind ourselves of how we got into it in the first place. We were
looking for a precise way to express what we mean when we refer to
something as complicated. We were trying to put a finger on what it
is that humans and moles and earthworms and airliners and watches
have in common with each other, but not with blancmange, or Mont
Blanc, or the moon. The answer we have arrived at is that
complicated things have some quality, specifiable in advance, that
is highly unlikely to have been acquired by random chance alone."
(Dawkins R., "The Blind Watchmaker", 1991, p9)

BH>The paper by Dembski (which I haven't read completely yet) tries
>to make the definition of specified complexity (he calls it CSI,
>complex specified information) more concrete and objective. Looks
>very interesting.

Indeed. Here is the abstract:

----------------------------------------------------------
Intelligent Design as a Theory of Information

William A. Dembski

For the scientific community Intelligent Design represents
creationism's latest grasp at scientific legitimacy. Accordingly,
Intelligent Design is viewed as yet another ill-conceived attempt by
creationists to straightjacket science within a religious ideology.
But in fact Intelligent Design can be formulated as a scientific
theory having empirical consequences and devoid of religious
commitments. Over the last seven years the mathematician Keith
Devlin, the philosopher David Chalmers, and the physicist David Bohm
have all formulated notions of information according to which
information constitutes a fundamental entity in the bio-physical
universe, on the same par as energy. This is not Claude Shannon's
information as carrying capacity of symbol-strings transmitted across
a communication channel (i.e. syntactic information). Nor is this
Robert Stalnaker's information as ruling out of possible worlds
(i.e., semantic information). Rather, this is a notion of functional
information, or what David Bohm calls "active information." This is
the information that confers function on complex systems, and
distinguishes complexity simpliciter from what David Berlinski and
Marcel Schutzenberger refer to as "functional complexity."
Intelligent Design can be unpacked as a theory of information. As
characterized in this theory, information becomes a proper object for
scientific investigation. In my paper I shall (1) show how
information can be reliably detected and measured, and (2) formulate
a conservation law that governs the origin and flow of information.
My broad conclusion is that information is not reducible to natural
laws, and that the origin of information is best sought in
intelligent causes. Intelligent Design thus becomes a theory for
detecting and measuring information, explaining its origin, and
historically tracing its flow.

12/26/96.

http://www.dla.utexas.edu/depts/philosophy/faculty/koons/ntse/abstracts/Dembski.html
----------------------------------------------------------

BH>In particular, he tries to develop a Law of
>Conservation of Information, which is exactly what I had suggested
>to Steve needs to be done.

Brian was actually requiring that *I* do it:

----------------------------------------------------------
Date: Sun, 01 Dec 1996 20:32:15 -0500
To: evolution@calvin.edu
From: "Brian D. Harper" <harper.10@osu.edu>
Subject: Re: design: purposeful or random?

[...]

I've done a little thinking on this since my initial question to
you and I've come with a suggestion that might be worth
pursuing. Note that this is just an intuitive guess on my
part (a non-expert) but it seems to me that some measure
of mutual information content probably will satisfy the condition
of not increasing due to a random mutation (the classical Shannon
information would increase). I'm also suspecting that mutual
information may also capture the basic ideas of irreducible
complexity with the added bonus of being objectively measurable.

I kind of like this idea, I wonder if someone has already thought
of it. Probably :). Let's suppose that some clever person comes
up with a measure of mutual information content and can show
that this information cannot increase due to a random mutation.
This would be only a small step towards providing any threat to
neo-Darwinism. One would still have to show that this information
measure doesn't increase due to the combination random mutation
plus selection mechanism. Good luck. It seems to me that one is
searching for some general principle that might be called the
"conservation of information". Stated this way it almost makes
me giggle out loud. OK, I confess, I did giggle a little as I wrote
that :). But who knows, maybe there is such a principle at least
in some narrower sense, i.e. "information is always conserved
for certain types of physical processes X, Y and Z".

OK, I've helped you out enough already ;-). When you discover
the CoI Principle and become famous, don't forget where you
got the idea ;-).
----------------------------------------------------------
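Brian's intuition about mutual information can be sketched numerically
(a toy construction of my own, not his formulation or Dembski's): take
a random reference sequence, mutate copies of it at two different
rates, and estimate the empirical mutual information between the
original and each copy. Heavier random mutation degrades the shared
information.

```python
import math
import random

def mutual_information(x, y):
    """Empirical mutual information (bits/symbol) between two
    aligned, equal-length sequences, from plug-in frequency estimates."""
    n = len(x)
    joint, px, py = {}, {}, {}
    for a, b in zip(x, y):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        px[a] = px.get(a, 0) + 1
        py[b] = py.get(b, 0) + 1
    mi = 0.0
    for (a, b), c in joint.items():
        p_ab = c / n
        mi += p_ab * math.log2(p_ab / ((px[a] / n) * (py[b] / n)))
    return mi

def mutate(seq, rate, alphabet="ACGT"):
    """Replace each symbol, with probability `rate`, by a random symbol."""
    return "".join(random.choice(alphabet) if random.random() < rate else s
                   for s in seq)

random.seed(1)
original = "".join(random.choice("ACGT") for _ in range(5000))
low  = mutual_information(original, mutate(original, 0.05))
high = mutual_information(original, mutate(original, 0.50))
print(f"5% mutation: {low:.3f} bits/symbol, 50% mutation: {high:.3f} bits/symbol")
```

This only illustrates the direction of the effect on a toy alphabet; it
says nothing about mutation plus selection, which Brian rightly flags
as the hard part.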

The unreasonableness of Brian's requirement that I develop a theory
of "conservation of information" in order to support my claim that:

"...one thing that `random mutation' cannot do is to `create new
information'"

is seen in the fact that philosopher-mathematician Dembski is only
just now breaking new ground in developing such a theory himself.

[...]

BH>One final comment. It seems that Steve Jones has made a valiant
>and commendable effort to catch up. As he seems to be
>concentrating on messages that have his name in them I figured I
>would mention his name to be sure he finds this. I think the
>Dembski paper will help him organize his arguments about
>organization so perhaps we'll have a more useful conversation.

I thank Brian for his commendation, but I had already advised on 07
Jan 97, more than three weeks *before* Brian's `poll' of 30 Jan 97,
that I am only responding to messages that have my name in them:

----------------------------------------------------------
From: "Stephen Jones" <sejones@ibm.net>
To: "evolution@Calvin.edu" <evolution@Calvin.edu>
Date: Tue, 07 Jan 97 20:47:13 +0800
Subject: Re: My Coming Out-Mediate Creation :-)

[...]

My new strategy will be to filter all mail with "Steve", "Stephen"
"SJ" and "Jones" into a separate in-basket. I will answer those
first before I look at the other mail in my in-basket. I will try to
post my replies at least weekly and ignore the rest. This means I
may not see anything posted to the Reflector without my name in it.
If you particularly want me to see it, you may have to cc. it to me.
----------------------------------------------------------

God bless.

Steve

-------------------------------------------------------------------
| Stephen E (Steve) Jones ,--_|\ sejones@ibm.net |
| 3 Hawker Avenue / Oz \ Steve.Jones@health.wa.gov.au |
| Warwick 6024 ->*_,--\_/ Phone +61 9 448 7439 (These are |
| Perth, West Australia v my opinions, not my employer's) |
-------------------------------------------------------------------