Re: geology, good science and a quest for info

From: David Bowman (David_Bowman@georgetowncollege.edu)
Date: Thu Sep 20 2001 - 17:55:55 EDT

    Thanks, George, for your response to my comments.

    Regarding:

    >Evidently, we agree on the main point that it is necessary to define one's
    >system when speaking about entropy since you too allow for an entropy
    >decrease at the cost of the surroundings;

    Quite true.

    >however, you confine yourself to equilibrium thermodynamics.

    I didn't mean to so confine myself. My point was that *even* in the
    simplest *near*-equilibrium situations the entropy of some local
    subsystems decreases. We don't even need to go far from equilibrium
    for that to happen. Of course, if we *do* look at far-from-equilibrium
    systems, we can see entropy locally decreasing in their subsystems,
    too.

    >Contrarily, far--from--equil. systems are not exotic at all.

    I didn't mean to suggest that they are exotic in any absolute sense,
    just that they are more complicated to understand than
    near-equilibrium systems and, as such, require more exotic methods
    when one attempts to understand their more complicated behavior.
    The term 'exotic' was meant only *relative* to equilibrium and
    near-equilibrium systems; it was not meant to suggest that such
    systems are rare or hard to make. Maybe I should have chosen a
    better phrase to describe my point--perhaps 'macroscopically
    complicated'.

    >They are very common and quite literally define the regime
    >in which "the action is" when discussing large scale evolution; i.e., not
    >fluctuations about equilibrium.

    True, living things *are* far from equilibrium in many important
    respects. My point, though, was that to refute the notion that the
    2nd law forbids entropy decreases, or that they can only happen via
    special "energy conversion systems" or via mechanisms controlled by
    some ill-defined "informational program", we need only look at
    something as simple as an object cooling in the presence of a cooler
    neighboring object that absorbs some of its heat to find
    counterexamples to such an erroneous claim.

    >Respectfully, it is my opinion, that the notion of infinitesimals is in
    >fact very exotic in that they are mathematical constructions created to
    >avoid actual infinities.

    Again, it seems that my choice of words may have been ill-advised.
    When I used the word 'infinitesimal', I did not mean to connote all
    the mathematical nuances of an epsilon-delta construction or a
    quasi-static limit in terms of a sequence of neighboring equilibrium
    states. I merely meant that in order for the hotter object to lose
    entropy it only has to give up some heat to a cooler region in its
    surroundings, and that the temperature difference between them
    needed for this to happen can be quite small indeed. We can be as
    close to an equilibrium situation as we desire, and the result of a
    decreasing entropy for the cooling object *still* holds. Being far
    from equilibrium is quite unnecessary for finding a counterexample.
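    To make the counterexample concrete, here is a minimal numerical
    sketch (the temperatures and heat transferred are arbitrary
    illustrative values, and for a small heat transfer the temperatures
    are treated as constant):

```python
# An object at T_hot gives a small amount of heat Q to a slightly
# cooler neighbor at T_cold. For small Q the temperatures are nearly
# constant, so each object's entropy change is dS = +/- Q / T.
T_hot, T_cold = 300.0, 299.9   # kelvin; an arbitrarily small difference
Q = 1.0                        # joules of heat given up by the hot object

dS_hot = -Q / T_hot            # the hot object's entropy *decreases*
dS_cold = Q / T_cold           # the cold object's entropy increases more
dS_total = dS_hot + dS_cold    # net change for the combined system

print(dS_hot < 0)      # True: a local entropy decrease
print(dS_total > 0)    # True: the 2nd law still holds overall
```

    The hot object's entropy decreases no matter how small the
    temperature difference is; only the *total* entropy of the pair is
    constrained by the 2nd law to increase.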

    > ...
    >Calculus later introduced infinitesimals to save us from going mad
    >contemplating infinities :-) .

    This is fortunate.

    >> The entropy of the Sun is *decreasing* (maybe Andrew just mistyped
    >> his comment here)--recall the temperature of the Sun is higher than
    >> that of interstellar space.
    >
    >Thank you, I did mean decreasing; (BTW, if you were referring to me, I am
    >George.)

    Oops, sorry. I did mean you, George Andrews. Apparently my brain
    accidentally permuted your first and last names in the quote above.

    >But entropy to which YEC refer is that which is defined by the second law;
    >i.e., in terms of energy (thermal) or logarithmic functions of accessible
    >phase--space state densities (Stat. Mech.). (There really are no others;
    >just analogies to this one; e.g. information entropy.)

    Actually, the thermodynamic entropy of a system is a *special case*
    of 'information entropy'. The thermodynamic entropy of a given
    macroscopic physical system is the average (minimal) amount of
    further information necessary to determine with certainty the exact
    microscopic state of that system, given only the specification of
    that system's macroscopic state. The conversion factor between
    thermodynamic entropy in J/K and in bits is:
    1 J/K = 1.0449388(18) x 10^23 bits .
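    The conversion factor follows from Boltzmann's constant: one bit
    corresponds to k_B * ln(2) joules per kelvin. A quick sanity check
    (using the modern exact value of k_B; in 2001 the constant carried
    the experimental uncertainty reflected in the parenthesized digits):

```python
import math

k_B = 1.380649e-23                     # Boltzmann constant, J/K
bits_per_JK = 1.0 / (k_B * math.log(2))

print(bits_per_JK)                     # ~1.0449e23 bits per J/K
```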

    But it would be very wrong for a YEC or IDer to conclude that just
    because thermodynamic entropy is a special case of an information
    entropy, and thermodynamic entropy obeys the 2nd law, that other
    special cases of other kinds of information entropy (such as the
    Shannon entropy of a particular ensemble of nucleotide or amino
    acid sequences) must also obey that law.
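    For comparison, here is what such a Shannon entropy looks like for a
    simple nucleotide-composition example (the probability values are
    invented purely for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]  # all four bases equally likely
skewed = [0.70, 0.10, 0.10, 0.10]   # a biased base composition

print(shannon_entropy(uniform))  # 2.0 bits per symbol (the maximum)
print(shannon_entropy(skewed))   # smaller: a biased ensemble carries less
```

    Nothing in the 2nd law dictates how a quantity like this must change
    over time; the law constrains thermodynamic entropy specifically.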

    David Bowman
    David_Bowman@georgetowncollege.edu



    This archive was generated by hypermail 2b29 : Thu Sep 20 2001 - 17:56:23 EDT