thermodynamics and evolution

Brian D Harper (harper.10@osu.edu)
Mon, 14 Apr 1997 23:42:22 -0400

Several days ago, Steve Schimmrich gave the address of
a web site related to the SCICHR list:

http://www.students.uiuc.edu/~s-schim/scichr/scichr.html

Browsing around I found an excellent article on thermodynamics
and evolution by Allan H. Harvey:

<The Second Law of Thermodynamics in the Context of the
Christian Faith>

http://www.students.uiuc.edu/~s-schim/scichr/essays/thermo.html

I would recommend that people read the entire article;
here, however, I'm going to deal only with issues related
to information theory and complexity.

Following is an excerpt from Harvey's document:

==========================================================
<What About Information Theory?
<
<Flaw #2 above is sometimes attacked by referring to
<information theory, which contains a quantity called
<"entropy." While I am no expert in information theory,
<I can say enough to deal with that particular argument.
<
<As a preliminary, we must talk about the definition of
<entropy from statistical physics. This definition is
<mostly due to Boltzmann, and is even engraved on his
<tombstone. Boltzmann defined the entropy of a system
<in terms of the number of different states available to
<it. So, for example, the expansion of a gas into double
<its original volume at constant temperature would
<represent an increase in entropy, because each molecule
<would have twice as much volume (and therefore twice as
<many "states") accessible to it. It is this definition
<that causes entropy to be thought of in terms of "disorder,"
<because a highly ordered system like a crystal has fewer
<available states. While it is an exaggeration to say that
<Boltzmann's identification of this quantity with the
<thermodynamic entropy has been "proven," it is universally
<accepted.
<
<More recently, a field has arisen called information
<theory. This deals with, among other things, quantifying the
<"information content" of various systems. Some of the
<results of information theory resemble the results of
<statistical physics, so much so that in certain well-defined
<conditions a quantity can be defined that is labeled
<"entropy" and that obeys something analogous to the 2nd law.
<While the identification of the information entropy with
<its thermodynamic counterpart is controversial, it is
<plausible enough to be taken seriously.
<
<So some creationists, recognizing that their argument does
<not apply to the thermodynamic entropy, assert that it does
<make sense in terms of the information entropy. This is
<because information theory talks about things more directly
<related to "complexity" and "disorder." But Flaw #2 above
<(in addition to Flaws #1 and #3) applies equally to the
<information entropy. If the 2nd law is to be applicable at
<all in this context, we must be able to make the rigorous
<definitions of information content required by the theory.
<But, just as we cannot measure the thermodynamic entropy of
<a person or of the Earth, we cannot begin to quantify the
<"information content" either. Whatever definition of entropy
<we use, we simply don't have enough information (no pun
<intended) to apply 2nd-law analysis in any sensible way to
<the question of the development of life on Earth.
==========================================================
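Harvey's gas-expansion example can be made quantitative. As a
rough illustration (mine, not from Harvey's essay), Boltzmann's
relation S = k ln(W) says that doubling the volume available to
each of N molecules multiplies the number of microstates W by
2^N, so the entropy rises by N k ln(2):

```python
import math

# Sketch of Boltzmann's relation S = k * ln(W): entropy is the
# logarithm of the number of accessible microstates, scaled by
# Boltzmann's constant. (My illustration, not part of the essay.)
k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number

def entropy_change_on_doubling(n_molecules):
    # Doubling the volume doubles the states available to each
    # molecule, so W grows by a factor of 2**N and
    # delta-S = k * ln(2**N) = N * k * ln(2).
    return n_molecules * k_B * math.log(2)

# For one mole of gas this comes out to R * ln(2), about 5.76 J/K:
print(entropy_change_on_doubling(N_A))
```

For a mole of gas the result is R ln(2), roughly 5.76 J/K, which
matches the classical thermodynamic entropy of isothermal free
expansion.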

I'm not particularly comfortable with the way the author
worded the last paragraph. The implication seems to be
that creationists are kind of flailing about desperately
looking for something, anything, as an argument. If classical
thermo doesn't work, let's try information theory. I think
it's fairer to say that the issues that seem to concern
creationists are really more appropriately formulated in
terms of information theory and complexity theory.

So, the question I want to address is whether the entropy
argument works in the context of information and complexity
theory.

I think a lot of the "trouble" here boils down to something
we've discussed here before, namely that a word often
means something entirely different when used in a technical
field than in everyday speech, and may mean different things
again in two different technical fields.

Earlier in the article referred to above, the author gave
the following summary of the thermodynamic argument against
evolution:

=======================================================
<The Second Law and Creation
<
<Now we address the context in which the 2nd law arises in
<creation arguments. The usual argument goes something like
<this: "The 2nd law says everything tends toward increasing
<entropy (randomness and disorder). But the evolution of life
<involves the development of great complexity and order.
<Therefore, evolution is impossible by the 2nd law of
<thermodynamics." While it sounds simple, there are major
<flaws in this argument that render it worthless.
=======================================================

The critical phrase here is "But the evolution of life
involves the development of great complexity and order."
My intent is not to blame creationists for faulty
terminology. Look in any biology book and I'm sure
you'll find the phrase "biological order and complexity"
or something like it. I'm sure I could easily find a
dozen articles from the mainstream literature that
use the terms biological order and biological complexity
as synonyms. In this context, I don't think there is
too much confusion.

The real problems start when one carries this terminology
over into complexity and information theory. In these
fields, the terms order and complexity have precise meanings
and are, in fact, opposites. Ordered complexity is an
oxymoron. It is correct to say:

"But the evolution of life involves the
development of great complexity"

It is incorrect to say:

"But the evolution of life involves the
development of great order"
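One way to see why "ordered complexity" is an oxymoron in the
algorithmic sense is to use compressed length as a rough,
computable stand-in for Kolmogorov complexity. Here is a minimal
Python sketch (my illustration, not part of the original
argument):

```python
import random
import zlib

def compressed_size(s):
    # Compressed length is a crude proxy for algorithmic
    # (Kolmogorov) complexity: highly ordered data compresses
    # well, while random data hardly compresses at all.
    return len(zlib.compress(s.encode()))

random.seed(0)
ordered = "AB" * 500  # highly ordered: "repeat AB 500 times"
disordered = "".join(random.choice("AB") for _ in range(1000))

print(compressed_size(ordered), compressed_size(disordered))
```

The periodic string has a short description and compresses
drastically; the random string of the same length and alphabet
has essentially no shorter description than itself, so it barely
compresses. The more ordered a string is, the *less* complex it
is in this sense.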

Now we note that the terms complexity, information and
entropy are used interchangeably in information and
complexity theory. They mean the same thing, so:

(1) But the evolution of life involves a
great increase in complexity

(2) But the evolution of life involves a
great increase in information

(3) But the evolution of life involves a
great increase in entropy

are just three ways of saying the same thing.

Of course, statement #3 seems rather odd as part of
an entropy argument against evolution. But this is
the conclusion one comes to when one is careful with
words.

Brian Harper
Associate Professor
Applied Mechanics
Ohio State University
"Aw, Wilbur" -- Mr. Ed