Re: The Mess the Designer (?) Made (Shall We Rub His Little Nose in

MikeBGene@aol.com
Tue, 14 Dec 1999 10:44:16 EST

Reply to Chris-

Me:

> When Susan argued (and Chris agreed) that design implies
> states that are clean and smooth, and evolution implies
> states that are messy and opportunistic, I agreed and took
> it a step further:

Chris replied:

>I did *not* agree that design implies states that are clean and smooth. I
>said that messiness makes design *questionable*, not false.

If design does not imply clean and smooth states, why
does messiness make design questionable? What is the
basis for your questioning if not this implication?

>Obviously, *if* there is a designer, we don't know what his
>plans or means are, so we don't know that messiness is *not* part
>of his plan. But messiness makes the notion *suspect* (and that's
>all).

Well, I agree that messiness makes the notion suspect. But
I also approach this topic without relying on a double standard,
meaning that a lack of messiness likewise makes the messy
evolutionary explanation suspect. Of course, I don't view messiness
or lack of messiness as a demonstration of any sort, but
I do view it as a measure of determining which way the
inferential wind is blowing.

>On the other hand, a nice clean orderly structure, such as
>might be designed by a human bio-engineer of the future with
>a lot of computer capacity at his disposal, should be *much* less
>complex than many existing cells (for the same functions).

Three things. 1. You seem to falsely equate complexity
with messiness; 2. You are simply guessing that designed
cells in the future will be much less complex; 3. You forget
that ID doesn't rule out that existing cells have a history of
evolution, meaning any messiness does not necessarily track
back to the originally designed state.

Me:

>And guess what? It has become increasingly clear that cells are anything
>but messy. This view may have been held in the 60s, but we now know
>that cellular life exists in such a way that precise timing, precise
>positioning, and precise arrangement are crucial. Or consider my posts
>on proofreading.
>Every step along the pathway of information flow is proofread (DNA
>replication, transcription, the charging of tRNA, the binding of tRNA to
>mRNA). This is the very opposite of messy and opportunistic.

Chris:

>Hardly. If it was as neat as you suggest, it would not *need* the
>proofreading process (or only to a much lower degree). The replication
>process would get it right the *first* time, if it was so good. This is the
>very opposite of neat and methodical. It is this very kind of kludginess
>that suggests that there *isn't* a designer (but, as I said, it does not
>*prove* that there is no designer).

Since Chris wants to single out replication, let's discuss it
some more, as later he will assert:

>This messy process is needed because the initial replication process
>is *so* error-prone, *so* messy, that proofreading is a necessity.

That Chris thinks replication (without proofreading) is *so*
error-prone and *so* messy suggests Chris doesn't understand
replication very well, as most scientists are impressed by
its fidelity.

Let's get some basic facts straight first. When DNA is replicated,
a new strand is synthesized by stringing together nucleotides
under the direction of the template strand. Now, the new strand
is synthesized at a rate of about 300-500 nucleotides/sec. And
without the proofreading function of the 3'-5' exonuclease
activity, the error rate is about 1 in 10^5 (proofreading
takes the error rate down to about 1 in 10^10). But to truly
appreciate how good this is, one must consider that the
basis for discrimination is very subtle as the four
nucleotides are quite similar. Normally, A binds to T
and G binds to C, but mismatching is clearly possible
as H-bonds can form between C/A and T/G (through
wobble-like interactions). I have not read of any
free energy calculations, but I suspect the differences
between Watson-Crick base-pairing and some
of these other forms of base-pairing are small
to non-existent. In fact, what is interesting about
DNA pol I from E. coli is that it discriminates
through subtle conformational shifts that depend
on the base pair conforming to the geometry of
standard Watson-Crick base-pairs (where non-WC
base-pairs have only a modestly different geometry
with respect to distances between the C-1' carbons
of the sugars and with respect to bond angles).
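To put those error rates in perspective, here is a minimal back-of-the-envelope sketch. The genome size is my own assumption (roughly the E. coli genome, about 4.6 million base pairs; it does not appear in the original post), while the two per-nucleotide error rates are the ones quoted above:

```python
# Rough arithmetic on the error rates discussed above.
# Assumption (mine): an E. coli-sized genome of ~4.6 million base pairs.

GENOME_SIZE = 4_600_000          # base pairs, approximate E. coli genome
RATE_NO_PROOFREADING = 1e-5      # errors/nucleotide, polymerase alone
RATE_WITH_PROOFREADING = 1e-10   # errors/nucleotide, with 3'-5' exonuclease

def expected_errors(genome_size, error_rate):
    """Expected number of misincorporated nucleotides per replication."""
    return genome_size * error_rate

print(expected_errors(GENOME_SIZE, RATE_NO_PROOFREADING))    # ~46 per replication
print(expected_errors(GENOME_SIZE, RATE_WITH_PROOFREADING))  # ~0.00046 per replication
```

Even granting the assumed genome size, the contrast is the point: tens of errors per replication without proofreading versus far less than one with it.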

To use an analogy, the DNA polymerase is not
reading a string of red, green, white, and black beads.
It is reading four beads in nearly indistinguishable
shades of gray. Chris claims this is *so* messy
and *so* error-prone, demonstrating that this is the
"very kind of kludginess that suggests that there *isn't* a
designer," but I don't see the basis for such claims.
We're talking about discrimination near the threshold
where the basis for discrimination begins to fade away.
And we're talking about doing it very quickly and very
well (as an aside, an interesting thought to ponder is why nature
would evolve such an efficient polymerase if proofreading
mechanisms existed).

Now, why do cells require proofreading? Apparently,
even though DNA pol is about as good as you can get,
this is not sufficient for faithfully replicating the amount
of information needed to sustain and propagate life. Life
is more than complexity; it is specified complexity.
Of course, Chris might want to argue that a designer
should have designed another form of genetic material,
or perhaps another Universe with a different set of
Laws, simply in order to do without the handful of
gene products needed to proofread. But this "should
have" argument is purely a speculative metaphysical
argument where we have not a single shred of evidence
that these other imaginary and undefined states would
be any better. For example, say we design genetic material
whose characters are easier to distinguish. Easy to imagine,
eh? But what does this mean in terms of the metabolic
machinery needed to make the characters? Would we trade
the few proofreading gene products for even more character-
synthesizing enzymes that might require even more energy?
Would a template strand with very different characters
be more "bumpy" so that it slows the polymerase or makes
it harder to package the genetic material?

Sorry, Chris, but I see it entirely differently. DNA replication
is not *so* messy after all. It is about as accurate as you can
get (even without the proofreading). Proofreading simply
means that life is built around specificity at a micro-level
so small that the ability to specify is nearly unobtainable.
That is, just about at the threshold where specificity becomes
possible, there we find the process we call life. This doesn't
speak of kludginess to me (a membrane bag full of random
second order reactions speaks more of kludginess). It speaks
of design at a level hard to imagine.

>And you are confusing cells with evolution. Susan was talking
>about the *process* by which things come about as an explanation
>for much of the kludginess of nature.

I know this. I simply don't think a messy process amounts to the
best explanation for a non-messy product.

>Because organisms either evolve or move or die out
>under strong selective pressures, and because these pressures keep changing
>directions without concern for the needs of the local organisms, there is a
>disjointedness in the accumulated changes, a degree of hodge-podgeness.

Yes, this is darwinism. And I do tend to agree, but I do so in a way
that doesn't employ the "heads I win, tails you lose" strategy. Namely,
if we are dealing with hodgepodge, disjointed, and kludgy things,
the inferential wind is pointing towards darwinism, but if we are
dealing with things not hodgepodge, disjointed, or kludgy, the
inferential wind is pointing away from darwinism (and towards ID).
Of course, there can be non-darwinian, non-ID causes, but until
someone spells these out, we really can't say what it implies (thus
we justifiably ignore it).

>Obviously, too much kludginess and organisms will go out of existence
>because the cost of maintaining the structure becomes too high, but there is
>still room for a significant degree of such kludginess because not only is
>each organism in the environment faced with the need to adjust, but so are
>its competitors, its predators, its prey, etc., so, while the survivors have
>to be "better" than the competing organisms that die out, they don't have to
>be *much* "better." They only have to be -- and often *are* -- just "good
>enough."

Yes, indeed. Natural selection, as the "designer," implies only
that things be "good enough." ID implies an engineering logic
behind the thing in question. For example, a rock is "good enough"
to pound a nail in the wall. But because it can do this, we don't
think it is designed to do this since it lacks the specified complexity
that maps to the function in question (a hammer fulfills this).

>I'd bet that when we find out enough about cells, we *will* find that they
>are not as "neat" as you suggest.

And if we do, the design inference would be weakened accordingly.
That is, the more complexity looks non-specified, the weaker
the design inference. But I'd bet we will not find this, as this
would run counter to the trend that has been developing over
the last century. Let's not lose sight of our historical context,
when Darwinists and naturalists have a long history of
underestimating the cell. Cells were originally thought to
be very simple entities (not far removed from some primordial
slime). This gave way to a realization that cells were complex,
but still very messy (the bag of second order reactions). But as
I wrote originally, now we're finding that specificity is
everywhere (shape, size, positioning, timing, arrangement).
And we're finding that cellular processes are often built
around very subtle shifts from one specific state to another. For
example, the ability of bacterial aspartate receptors to
transduce a signal to activate previously inactive proteins
may involve slight piston-like movements of the membrane
spanning helices that amount to only one angstrom. If
cells were that messy, such *meaningful* signal transduction
would not occur by such slight changes.

>For example, it may turn out upon
>examination and experimentation that there are many *much* more compact ways
>of doing the things that cells do, but that cells "can't get there from
>here," because it would require steps that are too big for DNA evolution
>*or* steps in directions and along paths that are not permitted by current
>selection pressures. If we follow the development of structures and genes,
>we don't find any signs of this development being *planned*. Cells for an
>organism may be at a local minimum of complexity, so that *small* changes
>of any one or several factors will disturb its functioning, but that does
>not mean that
>there are not much lower such minima that it can't (yet) reach because of
>the accidents of the past.

But you are confusing two concepts. Complexity/simplicity are
not the same as specificity/messiness. IMO, the focal point of
the design inference is specificity, as complexity is more of
an amplifier that serves to strengthen the design inference.
Put simply, the more complexity that is entailed in the
specified state, the stronger the design inference (although
specificity implies some degree of complexity).

Thus, I don't understand why a hypothetically more compact
state implies the complexity is really messiness. I suppose
it turns on how we define messy. Empirically speaking,
messiness has nothing to do with complexity. It is identified
only as the lack of specificity. On the other hand, perhaps
you are defining 'messy' along metaphysical lines, where
minimal complexity means a certain degree of complexity
is unnecessary. Yet once again, when we get into the
metaphysical realm, "should have been" arguments are
all based purely on imagination. And I don't see how
a minimally complex state is entailed by ID (in fact,
this borders on the metaphysical argument that created
entities should have been created perfect). In fact, if life
was designed, I would not expect minimal complexity
at the point of design. Why? Well, life would have been
designed for a reason. And the obvious candidate for such
a reason was that life was designed in order to evolve/develop
into what we see and are today. Thus, one might expect to find
a certain amount of redundancy, as this extra complexity may
not have been needed for a life process, but was needed to
evolve (a perfectly simple life form would never evolve).
Or at the very least, the extra complexity would serve to
channel evolution into specific general directions (stacking
the deck, so to speak).

Mike:

> If we are to propose a messy process generated cells, then
> for this proposal to be good science, it ought to carry
> implications about what we should find in the world and
> these implications should be turned into predictions that
> are then supported by what we do find.
>
> So what implications about life are made by the
> "messy process" proposal?
>
> It would seem to me a natural implication to propose
> messy products of a messy process. Common
> experience indicates this (I'm a fairly unorganized and
> messy person (process) and you should see my desk (product)).
> And it is unclear why natural selection would remove the mess, as
> natural selection cares only if "something works" and messy
> things can "work."

Chris:

>Natural selection *doesn't* remove the mess. That's the point, silly.

Exactly! So what removed the mess? It's one thing to claim
that a messy process would generate something as smooth and
clean as the cell, but to then add to this claim that all the
messy by-products of such a messy process would simply
disappear strains credibility.

(to be cont.)

Mike