Entropy (was Re: Human Designers vs. God-as-Designer)

From: David_Bowman@georgetowncollege.edu
Date: Sat Oct 21 2000 - 22:12:37 EDT


    Regarding the Ccogan/DNAunion/FMAJ entropy exchange:

    >>>Ccogan: No. I'm not implying that that one fact validates evolution. I'm
    >pointing out that matter has nothing against being organized in complex
    >ways.
    >
    >>> DNAunion: Actually it does: entropy. (Yes, localized decreases in entropy
    >are possible, but only at the expense of equal or greater increases in
    >entropy elsewhere: and the general rule is that the randomness and disorder
    >of a system tends to increase naturally). The problem with your statement is
    > that you incorrectly state that "matter has *nothing* against being
    >organized in complex ways." This is wrong.
    >
    >>>FMAJ: Nothing in the SLOT prevents matter becoming more organized.
    >
    >DNAunion: Well, except for the natural tendency of systems to go from states
    >of order to states of greater disorder. Matter *does* have something that
    >works against its being organized in complex ways. Of course, entropy can be
    >"circumvented" - the flow of matter and/or energy through a system can
    >generate local increases in order, for example, but that does not mean that
    >matter has *nothing* working against its becoming organized in complex ways.
    >
    >>>FMAJ: Sure, a local decrease in entropy needs to be offset by an increase
    >in entropy elsewhere but does this apply to matter or to energy?

    It applies to both. It applies *only* to the distribution of the
    microscopic states of the system in terms of its individual microscopic
    degrees of freedom. These degrees of freedom may describe ponderable
    matter with a nonzero rest mass (such as describing a massive particle's
    location and momentum) or may describe massless photons of radiation (by
    identifying how many photons of excitation are in a given mode of the EM
    field), or *any* other relevant physical degrees of freedom of the
    system. How you label them as being descriptive of 'matter' or 'energy'
    is not very important for this point.

    It's hard to make sense of DNAunion's claims and concessions in his last
    quote above. He appears to be claiming that matter has a tendency
    to prevent its organization except when it doesn't. How much content
    does such a claim really possess? Real laws of nature are not
    so vacuous.

    This whole discussion above is typical of what happens when people argue
    about the 2nd law, complexity, order, entropy, etc. without being careful
    about how they are using those words and/or without a precise
    understanding of just what the 2nd law does and doesn't allow and what
    it does & doesn't address one way or the other.

    Q1: What does the 2nd law of thermodynamics cover, i.e. its scope?

    A1: It constrains the dynamical behavior, as observed at the macroscopic
    level, of *any* identifiable physical system composed of a macroscopic
    number of microscopic degrees of freedom. That scope is global across all such
    kinds of physical systems, but it is quite specific concerning just what
    it actually describes in any particular system. What it requires is only
    that the *thermodynamic entropy* of the system (if it is effectively
    isolated from interaction with the rest of the universe) monotonically
    increases with time until the system comes to thermodynamic equilibrium,
    at which point its thermodynamic entropy is as large as possible for the
    system (consistent with all the macroscopic constraints that happen to be
    imposed on the system) and stops rising. At this stage the system's
    *macroscopic* state no longer is time dependent either.

    If the system is not effectively isolated from interactions with the rest
    of the universe then the *sum* of the thermodynamic entropy of the system
    *and* the thermodynamic entropy associated with that part of its
    environment that interacts with the system increases with time by virtue
    of the change of the system's (and relevant part of its environment's)
    macroscopic state until the system comes to equilibrium with itself and
    with the part of its environment that interacts with it. Again, once
    equilibrium has been achieved the system/environment's thermodynamic
    entropy and macroscopic state no longer change further with time.
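    The monotonic climb toward equilibrium described above can be
    illustrated with a toy "two-box gas" (an Ehrenfest urn model); this is
    a hypothetical sketch of my own, not anything from the post. N particles
    sit in two connected boxes, and at each step one randomly chosen particle
    hops to the other box. The Boltzmann entropy of the macrostate, log2 of
    the number of microscopic arrangements consistent with the box counts,
    rises from zero toward its equilibrium maximum:

```python
import math
import random

def macrostate_entropy_bits(n_total, n_left):
    """Entropy (in bits) of the macrostate 'n_left particles in the left
    box': log2 of the number of microscopic arrangements consistent with
    that macroscopic description."""
    return math.log2(math.comb(n_total, n_left))

def simulate(n_total=100, steps=2000, seed=1):
    random.seed(seed)
    n_left = n_total  # ordered start: every particle in the left box
    history = [macrostate_entropy_bits(n_total, n_left)]
    for _ in range(steps):
        # Pick one particle uniformly at random; move it to the other box.
        if random.randrange(n_total) < n_left:
            n_left -= 1
        else:
            n_left += 1
        history.append(macrostate_entropy_bits(n_total, n_left))
    return history

entropies = simulate()
```

    Starting from the fully ordered macrostate (entropy zero), the run
    fluctuates but hovers near the maximum, log2(C(100, 50)), about 96 bits,
    once the counts reach the equilibrium half-and-half split.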

    Q2: What is the thermodynamic entropy of a physical system?

    A2: It is the average minimal amount of further information necessary
    to determine, with certainty, just which microscopic state of the
    macroscopic physical system is its actual microscopic state given the
    full macroscopic description of it including all of the macroscopic
    constraints imposed on it. The ensemble from which the system's actual
    microscopic state is drawn is the set of all microscopic states that
    are consistent with the given macroscopic description and are mutually
    accessible to each other by the microscopic dynamical laws of motion
    describing the interactions of the system's microscopic degrees of
    freedom. IOW, the system's thermodynamic entropy is the average minimal
    number of bits of information you would have to be given by a
    cooperative omniscient being so that you could be certain as to just
    which microscopic state the system is in at a given time, when the only
    prior information you have about the system is its specification at the
    macroscopic level. Converting to SI units gives
    1 J/K of entropy = 1.04493 x 10^23 bits of needed info about the system's
    microscopic state.
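    The conversion factor quoted above can be checked directly from the
    Boltzmann constant (a quick sketch; k_B is taken at its modern defined
    SI value, so the last digit differs slightly from the figure in the
    text, which used an older value):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

# Thermodynamic entropy S = k_B * ln(W) counts missing info in natural
# units; dividing 1 J/K by k_B * ln(2) re-expresses it in bits.
bits_per_joule_per_kelvin = 1.0 / (k_B * math.log(2))

print(bits_per_joule_per_kelvin)  # ~1.0449e23 bits per J/K
```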

    Q3: Does matter have anything against being organized in complex ways?

    A3: In general, no. The only conceivable special exception to this which
    may have any remote relevance whatsoever to the 2nd law of thermodynamics
    arises *if* one *identifies* what one *means* by the system's so-called
    'disorder' with the thermodynamic entropy of the system. In that case
    the 2nd law's insistence that the thermodynamic entropy of the system +
    that of those parts of its environment interacting with it not decrease
    with time means that this particular (and peculiar) measure of the total
    'disorder' doesn't decrease.

    Typically such an identification would be an unfortunate choice of
    definition for the concept 'disorder' in general. Usually, we think of
    'order' and 'disorder' in terms of possible patterns and symmetries of
    arrangements of things. Entropy doesn't necessarily have anything to do
    with such concepts. Rather, it is only a measure of net uncertainty or
    total ignorance about the identity of the system's microscopic state
    given only a knowledge of its macroscopic state. It doesn't concern
    itself with any possible patterns and symmetries or their absence in
    general.

    The reason that thermodynamic entropy got associated in the popular mind
    with disorder in the first place is that many introductory textbooks
    made (and many still make) this association so as to give beginning
    students something to mentally picture for the abstract concept of
    entropy. The connection is as follows: Typically
    when a collection of objects is arranged in a particular patterned
    arrangement it requires less information to specify the particular
    patterned arrangement than when there is no such pattern. So if one
    considers the amount of information needed to be communicated to an
    otherwise ignorant agent so that agent can reconstruct the patterned
    arrangement to be some measure of the arrangement's disorder, then the
    more orderly the pattern the less info needs to be communicated, and
    the more disorderly the arrangement the more info is needed. Since
    entropy concerns itself with missing information or ignorance, the
    association was sort of natural. Also, for example, when a thermodynamic
    system has its atoms arranged in a nice orderly crystal it requires less
    info to specify the arrangement than when those atoms are haphazardly and
    randomly arranged as in the melted liquid. The orderly crystal *does*
    have less entropy than when the system is a melted liquid.

    But there are multiple problems with this naive and fuzzy identification
    of entropy and disorder.

    First, the *only* kind of "disorder" that has any relevance to anything
    associated with the 2nd law of thermodynamics, and has any relevance
    at all to any so-called nature of matter to have a "tendency towards
    disorder" is the "disorder" associated with the physical system's
    thermodynamic entropy. This is *only* concerned with the system's
    *microscopic states* and any possible "order/disorder" at *that* level.
    It has *nothing* directly to do with any patterns, or ordering that may
    or may not form at any macroscopic level of description. Physical
    systems have *no* problem spontaneously forming any ordered or complex
    arrangements at an aggregate or macroscopic level. That's why such
    things as oak trees form from acorns, hurricanes form from masses of
    unstable humid air over warm ocean water, regular mud cracks form when a
    muddy field dries out in the hot direct sunlight, stars and planetary
    systems form from interstellar clouds of gas & dust, etc. etc. all with
    no problem whatsoever from the limitations of the 2nd law of
    thermodynamics. In fact, such macroscopically organized systems
    spontaneously form *by* the action of the 2nd law when a sufficiently
    far-from-equilibrium condition is maintained by the boundary conditions
    imposed on the system as it interacts with its environment in ways that
    generate more total thermodynamic entropy in the universe at a higher
    rate when the system organizes than would be the case if the system
    remained unorganized at the macroscopic level.

    Second, *even* if one restricts one's notion of the "disorder" to be
    associated with only that of the microscopic arrangements of the atoms
    then this *still* does not get at the essence of what the entropy is
    about. For a system made of effectively classical particles its
    entropy is the mean minimal amount of information necessary to determine
    the microscopic state of each of the system's atoms. This info must
    be sufficient to identify each particle's precise location in space
    *and* to determine each particle's precise momentum (up to the precision
    restrictions imposed by the Heisenberg uncertainty principle). So just
    specifying or determining the precise arrangement in space of the
    atoms is not the whole story. We also need all the precise momenta.

    Third, *even* if disorder in momentum space as well as in position space
    is included in our notion of disorder for the particle arrangements, and
    *even* if we restrict our notion of disorder to only include the disorder
    of the microscopic states of the system, then this *still* doesn't really
    get at the system's thermodynamic entropy because the thermodynamic
    entropy is the *statistical average* of the minimal amount of
    information necessary to *identify* or *determine* the microscopic state
    out of a pool or ensemble of all the allowed microscopic states. Whereas
    a typical notion of "disorder" is more closely associated with (and is
    defined in terms of) the usual notion of *complexity*. Complexity is
    *not* entropy. Typical complexity measures are defined in terms of the
    amount of information needed to *reconstruct* or prepare any given
    arrangement. The information associated with complexity is *not* the
    same kind of information as is relevant for entropy. Each given
    instantiation of a microscopic state has its own *individual* complexity
    and has its own given order/disorder value defined in terms of that
    complexity measure (along with a measure of the complexity of the most-
    ordered state allowed) for that microscopic state. The entropy, OTOH, is
    a statistical average over the ensemble of possible microscopic states;
    it is not defined for the individual microscopic states themselves.
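    The complexity-versus-entropy distinction can be sketched numerically.
    In this illustration of my own (not from the original exchange), the
    compressed size of a byte string stands in, very crudely, for a
    reconstruction-cost complexity measure, while the enumerative count is
    just log2 of the number of equally likely states:

```python
import math
import os
import zlib

# Enumerative info: identifying one state out of N equally likely states
# costs log2(N) bits, no matter what the states themselves look like.
N = 2 ** 20
enumerative_bits = math.log2(N)  # 20 bits

# Crude per-state "complexity" proxy: compressed size in bits.
simple_state = b"A" * 4096        # highly patterned arrangement
complex_state = os.urandom(4096)  # haphazard, incompressible arrangement
simple_bits = 8 * len(zlib.compress(simple_state))
complex_bits = 8 * len(zlib.compress(complex_state))
```

    The patterned string compresses to a tiny fraction of the random one,
    yet both contribute exactly one entry, and hence the same enumerative
    cost, to the count over the ensemble.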

    Fourth, since such complexity measures involve the information needed
    for the full reconstruction of a given state, this information is not the
    same kind of information that the entropy cares about. The entropy cares
    about just an *enumerative* kind of info which is only the info needed
    to merely *pick out* or identify (with certainty) the actual microscopic
    state from a *numbered list* of such states--*not* all the info needed to
    completely reconstruct a given state. The info needed to choose a given
    numbered item is just the number of bits or digits needed to encode the
    listed number of that possibility. For example, *if* our (very
    unrealistic) system has only a million (10^6) possible microstates, and
    if each one of them is equally likely as all of the others, then the
    entropy of this set of possibilities is just 6 decimal digits (or about
    19.9 bits) no matter *how* complicated the individual microscopic states
    happen to be to build. If the number of equally likely microscopic
    possibilities is 2^N, then the entropy of that ensemble is just N bits
    regardless of the complexity of the construction of each possibility.
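    The arithmetic in that example takes only a couple of lines to check
    (a quick sketch; note that log2 of 10^6 works out to about 19.9 bits,
    the binary equivalent of 6 decimal digits):

```python
import math

# A million (10^6) equally likely microstates: the enumerative info
# needed to pick one out is 6 decimal digits, or about 19.9 bits.
n_states = 10 ** 6
digits_needed = math.log10(n_states)   # 6 decimal digits
bits_needed = math.log2(n_states)      # ~19.9 bits

# 2^N equally likely microstates: exactly N bits, however elaborate each
# individual microstate may be to construct.
n_bits = math.log2(2 ** 40)            # 40 bits
```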

    The general notion of entropy is much broader than that of the
    thermodynamic entropy, which is the only kind of entropy that has any
    relevance for the 2nd law of thermodynamics. In general, the entropy is
    a property or statistic which is possessed by a probability distribution.
    Such a distribution assigns a given probability, p_r to each possible
    outcome r where the index r ranges over a list of the outcome
    possibilities for the distribution. The entropy of the distribution is
    S = SUM{r, p_r*log_b(1/p_r)} where the sum ranges over each outcome r,
    and the log_b(...) means the logarithm taken to the base b. The value of
    b is the number of distinct symbols in the symbol set used to encode the
    information that identifies the possibilities. If the info is in terms
    of a binary string of 0's & 1's then the base is b = 2, and if the
    symbol set is the set of 10 distinct single-digit Hindu-Arabic numerals
    ranging from 0 - 9, then the value of the base is b = 10. If the info
    is encoded in bytes (with 256 distinct bytes possible) then the base is
    b = 256. An important special case happens when all of the possibilities
    are equally likely; then the entropy functional boils down to the simpler
    formula S = log_b(N) where N is the total number of possible outcomes
    for the random process.
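    The entropy functional above translates directly into code. A minimal
    sketch (the function name `shannon_entropy` is my own label, not from
    the post):

```python
import math

def shannon_entropy(probs, base=2):
    """S = SUM{r, p_r * log_b(1/p_r)} for a probability distribution
    {p_r}, in units set by the log base b (bits for b = 2)."""
    return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

# Uniform special case: N equally likely outcomes give S = log_b(N).
uniform = [1.0 / 8] * 8           # N = 8 -> ~3 bits in base 2

# A sharply peaked distribution leaves much less missing information.
peaked = [0.97, 0.01, 0.01, 0.01]
```

    Evaluating `shannon_entropy(uniform)` recovers log2(8), while the
    peaked distribution, being nearly certain, carries well under one bit.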

    Thermodynamic entropy is just a *special case* of this general notion of
    entropy. In the case of thermodynamic entropy, each of the set of
    outcomes/possibilities {r} just label each of the set of all the possible
    microscopic states that are consistent with the system's macroscopic
    state, and each p_r is the probability that the physical system is in
    its r-th microscopic state. But it is *only* the thermodynamic entropy
    (i.e. entropy of the probability distribution for the microscopic states
    compatible with the system's macroscopic description) that has *anything*
    to do with the 2nd law. Any entropy of any other distribution, (such as
    for some distribution of various possible complex macroscopic patterns)
    is not a concern one way or the other of the 2nd law of thermodynamics.

    >DNAunion: I may not understand your question, but a flow of matter/energy
    >through a system can "overcome" the local natural tendency towards disorder.

    Any such flow itself *is* a natural tendency. There is nothing to
    "overcome" here. Laws of nature are not "overcome" or abrogated by other
    natural processes. (At least in physics they aren't.)

    >>>Ccogan: Thus, the question arises: Might not some small bits of it become
    >complex through natural, material processes not involving design?
    >
    >>>DNAunion: Sure, matter can become *ordered* without design: the birth of
    >stars, the spontaneous formation of vortices when water is let out of a
    >drain, clouds forming from dispersed water droplets, etc. But these examples
    >of order forming do not deal with specific complexity arising by purely
    >natural means, and specified complexity is one of the main properties of all
    >life.

    DNAunion's own examples here show that matter does not have the
    "tendency" he supposes.

    The 2nd law is not at all concerned one way or the other with the
    presence or absence of specified complexity. Any appeal to the 2nd law
    of thermodynamics and to thermodynamic entropy to attempt to justify a
    supposition that only ID can create specified complexity is a red
    herring, non sequitur, and just plain wrong.

    >>>FMAJ: So show how specified complexity cannot be formed by evolutionary
    >pathways?
    >
    >DNAunion: I already gave a general example: the latest issue of "Origins of
    >Life and Evolution of the Biosphere".
    >
    >Now, if I could give honest-to-goodness, verifiable, valid examples from
    >biology - and fully back them up in excruciating detail - then we probably
    >wouldn't be having this discussion at all as ID would have catapulted up in
    >scientific credibility. Since ID has not made the leap, I feel it safe to
    >conclude that no IDist has yet been able to create an airtight case for
    >specified complexity's not being able to arise by natural processes
    >("airtight" being required for IDists).

    This is a refreshing bit of candor. So instead of simply demonstrating
    that 'mindless' natural processes are incapable of generating specified
    complexity as DNAunion and other IDers claim, he (and often others)
    instead tries to substitute fuzzy and invalid appeals to entropy, the
    2nd law, or some ill-defined tendency of matter toward "disorder",
    hoping that the audience will be duped; only if they object will it be
    conceded that such appeals are, in fact, without any real foundation.
    If you want to take the claim that natural processes cannot generate
    specified complexity as a matter of faith, that's fine. Just don't go
    around also trying to claim that these IDist articles of faith are a
    form of science.

    David Bowman
    David_Bowman@georgetowncollege.edu



    This archive was generated by hypermail 2b29 : Sat Oct 21 2000 - 22:17:26 EDT