Re: 2nd Law of Thermodynamics

Brian D Harper (bharper@postbox.acs.ohio-state.edu)
Tue, 20 Jan 1998 10:27:37 -0500

At 09:31 PM 1/19/98 -0800, Greg wrote to Ron:

>
> Why are you so reluctant to answer the original question, though? Have
> you decided that speciation isn't really what you mean by 'macroevolution'?
>
> (Here it is as a reminder: What is more orderly about two groups of
> organisms which can't interbreed as opposed to two groups which can?)
>

Hopefully Greg will not mind if I butt in here :) as
this question is, I think, a good illustration of a
point I was trying to make earlier. Armed only with
the words order and disorder, how is one to decide
which of these is more orderly? I'm going to say
that the two groups which cannot interbreed are
less orderly, since this represents a case of greater
diversity, and order decreases as diversity
increases. Ron might say the opposite, and herein
lies the problem with the word-game approach: what
appears orderly to one person may appear disorderly
to another.

So, let's try to make it more precise. In discussions about
evolution one generally hears about only one or at most two
entropies, the Maxwell-Boltzmann-Gibbs entropy of statistical
thermodynamics and the Shannon entropy of classical
information theory. It turns out that there are many
entropies, more than you can shake a stick at. In a
discussion on bionet.info-theory, one of the regulars
did a quick literature search and found literally hundreds
of entropies.

Now to the point, Tom Ray uses an entropy to provide
an objective measure of diversity in his Tierra World
simulation of Darwinian evolution:
====================================================
Ray, T. S. 1994. Evolution, complexity, entropy, and
artificial reality. <Physica D> 75: 239-263.

[http://www.hip.atr.co.jp/~ray/pubs/oji/ojihtml.html]

Abstract:

The process of Darwinian evolution by natural selection
was inoculated into four artificial worlds (virtual
computers). These systems were used for a comparative
study of the rates, degrees and patterns of evolutionary
optimizations, showing that many features of the
evolutionary process are sensitive to the structure of the
underlying genetic language. Some specific examples of the
evolution of increasingly complex structures are described.
In addition a measure of entropy (diversity) of the evolving
ecological community over time was used to study the
relationship between evolution and entropy.
====================================================

The entropy referred to above has the same form as the
thermodynamic and Shannon entropies, i.e. it is the
negative sum of p log(p). In this case p refers to
the proportion of the total community occupied by
each genotype. Ray points out that this entropy is
a measure of the diversity of the community. To see
why, consider first a case where there are 100
genotypes but one of them occurs, say, 80% of the
time. This would correspond to low diversity and
also, according to the formula - sum [p log(p)],
to low entropy. The case of highest diversity, and
also highest entropy, would be where all 100 genotypes
occurred at the same frequency.

Thus, increasing diversity leads to increasing entropy.
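This is easy to check numerically. Here's a minimal sketch (my own
illustration, not code from Ray's paper) comparing the two communities
just described, a skewed one and a uniform one:

```python
import math

def shannon_entropy(proportions):
    """Return -sum(p * log(p)) over genotype proportions (zero terms skipped)."""
    return -sum(p * math.log(p) for p in proportions if p > 0)

# Skewed community: one genotype at 80%, the other 99 sharing the remaining 20%.
skewed = [0.80] + [0.20 / 99] * 99

# Uniform community: all 100 genotypes equally frequent.
uniform = [1.0 / 100] * 100

print(shannon_entropy(skewed))   # low diversity -> lower entropy
print(shannon_entropy(uniform))  # maximal diversity -> entropy = log(100)
```

The uniform community attains the maximum possible entropy for 100
genotypes, log(100), while the skewed community comes out well below it,
matching the verbal argument above.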

So, here we have scientific proof that diversity leads
to disorder. This is based on one of the most fundamental
laws of science, the second law of thermodynamics.
Yet, our science-ignorant dilbert politicians keep
insisting on increasing the diversity of our society.
It is no wonder our schools are in such bad shape and
that crime is rampant in our streets.

:-(|)

Brian Harper
Applied Mechanics
Ohio State University
214 Boyd Lab
155 W. Woodruff Ave
Columbus, OH 43210

"All kinds of private metaphysics and theology have
grown like weeds in the garden of thermodynamics"
-- E. H. Hiebert,