Re: How did we get that way?

Brian D. Harper (bharper@magnus.acs.ohio-state.edu)
Fri, 19 Apr 1996 15:41:30 -0400

At 08:27 AM 4/19/96 -0600, Denis wrote:

>
>So, being fully aware that this thermodynamics problem is way out of my
>league, is it possible to offer a cartoon on what this second law stuff is
>all about for the "mathematically unclean" like me? Or is it such
>that these categories are not "translatable" into understandable
>layperson terms? And if that is the case, that's OK. Merci.
>

Translating things into layman's terms is something I always
have trouble doing. In my own teaching I have always found
that teaching the most elementary concepts is the most difficult.
After years of familiarity, some things become obvious and it's
easy to lose sight of the fact that they are not obvious to someone
who hasn't spent years studying the subject. Just a couple of
weeks ago an undergrad argued vehemently with me for about 30
minutes about whether one could uniquely solve two linear
algebraic equations in two unknowns. At first I just stared at
him in disbelief. He probably thought my silence was in tribute
to the brilliance of his arguments ;-). I told him that if he wanted
to pass the course (Statics) he'd better learn how to solve two
equations for two unknowns. Perhaps he now considers me one of
those evil High Priests ;-).

I don't think I want to try giving you exactly the answer you're
after. This can be approached in several ways. The best
refutation of the entropy argument that I have seen is Hubert
Yockey's. Extracts of it are in the file Yockey#11 that
I posted some time back. It should be in the archives; if
you can't find it, let me know and I can send you a copy.
This is not the traditional answer, but I think it is the
best and clearest. And besides, you'll get a bonus prize
of seeing Gish "hoist by his own petard" ;-).

Let me try to approach this from the side. In another thread
Tom Moore made the very important point that popular-level
or secondary sources can contain watered-down treatments of
a subject which, if one is not careful, can lead to erroneous
conclusions. I had intended to give thermo and information
theory as two more examples of this. Instead, I'll make those
comments here.

Your statement "Just looking at the math in his textbook one day
was all I needed to realize I DIDN'T HAVE A CLUE what thermodynamics
was about." is getting right to the heart of the matter. Pop level
accounts generally try to explain thermo without the math. This
is often done by introducing certain metaphors. Entropy can be
thought of as "disorder" and blah blah blah. This is a useful
metaphor for entropy but like all metaphors it can only be
taken so far. Problems arise when the metaphysicians and
Creationists start playing their word games with the word
disorder. Suddenly entropy begins to take on moral attributes :).
People forget that disorder is just a useful metaphor. Entropy
is the name for a parameter in a mathematical expression.
It gains its "meaning" (often counter-intuitive) from the
mathematics and not from the metaphors used to try to explain
the concept to the layman.
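
Just to pin the word down a little (my own thumbnail summary, not
anyone's argument): in classical thermodynamics entropy is defined
through its change,

    dS = dQ_rev / T

and in statistical mechanics Boltzmann's relation

    S = k ln W

ties it to W, the number of microstates consistent with a given
macrostate. "Disorder" is an attempt to put W into ordinary words;
the second law itself is a statement about S, not about tidiness.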

In information theory one has a similar problem in that
the word "information" simply does not mean what it does
in every day usage. For example, a monkey typing randomly
at a keyboard will produce a document with a greater
"information content" than _Origin of Species_ or
_Reason in the Balance_. Now, Chuck is likely to say that
this is utter nonsense and one doesn't have to know anything
about info-theory to recognize it as drivel. It's almost as
bad as his hypothetical accountant. Nevertheless, it makes
perfect sense once someone understands what "information
content" means in the context of information theory.

As with thermodynamic entropy, "information content" is
precisely defined and draws its "meaning" from the
mathematics and not from Webster's dictionary.
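
For the curious, here is the kind of thing actually being measured
(a rough sketch in my own notation, nothing official). The
simplest, zeroth-order measure is

    H = - SUM_i p_i log2(p_i)    bits per symbol

where p_i is the relative frequency of symbol i. A monkey hitting
27 keys (26 letters plus the space bar) uniformly at random gives
p_i = 1/27 and H = log2(27), about 4.75 bits per character, while
ordinary English prose, with its very uneven letter frequencies,
comes out closer to 4. A little program along these lines (the
names and the text sample are just mine for illustration; the
prose is Darwin's closing paragraph, quoted from memory) shows
the difference:

import math, random, string
from collections import Counter

def entropy_per_char(text):
    # zeroth-order Shannon entropy, in bits per character
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

alphabet = string.ascii_lowercase + " "
monkey = "".join(random.choice(alphabet) for _ in range(100000))
darwin = ("it is interesting to contemplate an entangled bank, "
          "clothed with many plants of many kinds, with birds "
          "singing on the bushes, with various insects flitting "
          "about, and with worms crawling through the damp earth")

print(entropy_per_char(monkey))  # close to log2(27), ~4.75 bits/char
print(entropy_per_char(darwin))  # noticeably lower for English prose

The numbers wobble a bit for short samples, but the ordering is
robust: the random string carries more "information content" per
character, which is all the monkey example above is claiming.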

To make matters worse, Shannon chose to call "information content"
entropy. There is a humorous anecdote about how Shannon decided
on this, which is repeated in many texts on information theory.
Apparently Shannon wanted to call it a measure of information
content but hesitated for fear of confusion since "information"
has so many different meanings. He then discussed his little
"problem" with his friend, Von Neumann, who advised him to call
it "entropy" for two reasons:

"First, the expression is the same as the expression for
entropy in thermodynamics and as such you should not use
two different names for the same mathematical expression,
and second, and more importantly, entropy, in spite of
one hundred years of history, is not very well understood
yet and so as such you will win every time you use entropy
in an argument."
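
(And von Neumann wasn't exaggerating on the first point: Shannon's
H = - SUM p_i log p_i has exactly the same form as the Gibbs
entropy S = - k SUM p_i ln p_i of statistical mechanics; the two
differ only by the constant k and the choice of logarithm base.)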

Now, with entropy meaning both disorder and information, the
word games really begin. Everyone knows that things have a
natural tendency to proceed from order to disorder, and everyone
knows that disordered things contain no meaningful information,
and away we go .......

========================
Brian Harper | "I can't take my guesses back
Associate Professor | That I based on almost facts
Applied Mechanics | That ain't necessarily so"
Ohio State University | -- Willie Nelson
========================