Re: Entropy (was Re: Human Designers vs. God-as-Designer)

From: DNAunion@aol.com
Date: Tue Oct 24 2000 - 17:13:13 EDT


    >> DNAunion: All life requires that it actively maintain itself far above
    thermodynamic equilibrium. For an acorn to grow into an oak, it must fight
    against, and "overcome", entropic tendencies at every moment along the way.
    This example does not contradict my statements.

    >>FMAJ: Exactly. This far-from-equilibrium thermodynamics is exactly
    what drives evolution and the creation of complexity. So what does this show?

    >>DNAunion: It shows that there *is* something that opposes matter's being
    organized in complex ways, which must be continually fought: when it is
    battled, it *can* be "overcome".

    How many times do I have to explain this? I am *not* stating that increases
    in order or complexity *cannot* occur, just that in order for them to
    occur, entropy must be *overcome*. Entropy *is* something that opposes
    matter's being arranged in organized and complex ways.

    >>Chris: Actually, I don't think that the complexity of the Universe as a
    whole changes at all. *Organization* changes, of course. But, randomness is
    as complex as you can get; it's just not organized in ways that we would
    recognize as such.

    The complexity of randomness is what makes the claims that random processes
    cannot produce complexity ironic; that's what random processes are *best* at
    producing. What they are not so good at is producing simplicity and
    systematic organization.

    DNAunion: Good point about complexity as it relates to randomness.

    Seeing that David Bowman is so much more informed on the subject than I am
    (no sarcasm intended; he obviously knows much more than I do about entropy,
    thermodynamics, and their relation to complexity), I would like to ask him a
    question.

    I have heard both complexity and randomness defined in terms of a measure of
    the degree of algorithmic compressibility.

    HHTTHHTTHHTTHHTTHHTTHHTTHHTT

    HTTHHHTHHTTHTHTHHHHTHHTTTHTT

    The first coin-flip example can be described by "repeat HHTT 7 times". Even
    if the sequence were extended out to a billion symbols, the description would
    become hardly any longer, something like "repeat HHTT 250,000,000 times".
    The sequence has a very short minimal algorithm that fully describes it, so
    it is not random (nor complex?).

    But the second coin-flip example has no such shortcut. The shortest
    algorithm that fully describes the sequence is (for all we can tell) the
    sequence itself. Therefore, it is random (and/or complex?).
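    The compressibility idea can even be illustrated crudely in code. True algorithmic (Kolmogorov) complexity is uncomputable, so as an admitted stand-in this sketch uses zlib's compressed length as a rough measure of "shortest description" (the function name is my own):

```python
import random
import zlib

def compressed_length(s: str) -> int:
    """Length in bytes of the zlib-compressed sequence -- a rough,
    computable stand-in for the length of its shortest description."""
    return len(zlib.compress(s.encode("ascii"), 9))

# A million-symbol version of the first sequence above.
patterned = "HHTT" * 250_000

# A comparably long run of simulated fair coin flips.
rng = random.Random(0)
irregular = "".join(rng.choice("HT") for _ in range(1_000_000))

# The patterned sequence compresses to almost nothing; the irregular one
# resists compression, since no shortcut description can be found.
print(compressed_length(patterned), compressed_length(irregular))
```

    Running this shows the patterned million-symbol string shrinking to a few kilobytes while the coin-flip string stays on the order of its 1-bit-per-symbol entropy, which is the intuition behind calling the second sequence random.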

    PS: I also understand that a more detailed test for randomness would involve
    taking the symbols 1 at a time, then 2 at a time, then 3 at a time, ... and
    seeing how well the observed distribution of symbol groupings matches
    the expected distribution. Does this apply to complexity as well?
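    That k-symbols-at-a-time check can be sketched briefly (the function name is my own). For a truly random H/T source, all 2^k possible length-k blocks should appear with roughly equal frequency; the repetitive sequence uses only a few of them:

```python
from collections import Counter

def block_frequencies(seq: str, k: int) -> Counter:
    """Count overlapping length-k blocks of seq."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

patterned = "HHTT" * 7   # the first example sequence above

# Only 4 of the 16 possible length-4 blocks ever occur, where a random
# source would tend to show all 16.
counts = block_frequencies(patterned, 4)
print(sorted(counts))    # ['HHTT', 'HTTH', 'THHT', 'TTHH']
print(len(counts))       # 4
```

    Note that taking the symbols only 1 or 2 at a time, this sequence actually looks uniform (14 H vs. 14 T; the four pairs occur 7, 7, 7, and 6 times), which is why the test has to progress through larger and larger groupings before the structure shows up.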

    Basically, in terms of symbol sequences, what is the difference between
    randomness and complexity? I have heard even physicists (in works directed
    at general audiences) use the terms interchangeably.



    This archive was generated by hypermail 2b29 : Tue Oct 24 2000 - 17:13:39 EDT