Merv & David:
A few comments.
First, I don't understand the seemingly discontinuous comment about the
Planck length.
Second, I don't follow the argument. It seems to me that from beginning
to end they are discussing epistemological uncertainty and not
ontological uncertainty. In fact, it seems to me that the Heisenberg
uncertainty can be similarly interpreted.
Since I don't consider the Heisenberg uncertainty to really get at the
matter (it can be viewed as merely the result of not attempting to
measure an eigenvalue), consider instead something like the decay of a
radioactive nucleus.
We are told that if one were to ask why this particular nucleus decayed
at this particular instant, the "appropriate" answer is that there is no
reason. Yet we are also told that the statistical decay of a host of
such nuclei has so small a variance that it supports extremely accurate
radiometric dating.
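To see how sharp that contrast is, here is a little Python sketch (the
numbers are purely illustrative, not tied to any real isotope): each of
N nuclei decays independently with probability p per time step, so no
single decay time is predictable, and yet the fraction decayed after a
fixed time clusters tightly around 1 - (1-p)^t, with a spread that
shrinks like 1/sqrt(N).

    import random

    # Purely illustrative parameters: N nuclei, each with probability p of
    # decaying in each of t time steps, repeated over several trials.
    N, p, t, trials = 100000, 0.01, 100, 20
    p_decay = 1 - (1 - p) ** t   # chance a given nucleus decays within t steps

    fractions = []
    for _ in range(trials):
        decayed = sum(random.random() < p_decay for _ in range(N))
        fractions.append(decayed / N)

    print(p_decay)                         # expected fraction, ~0.634
    print(min(fractions), max(fractions))  # every trial within ~0.5% of it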
The situation is analogous to tossing an honest penny. If one were to
try to predict whether this penny, on this toss, will come up heads or
tails, our knowledge would be completely uncertain. All we could say is
that it will be either heads or tails. And this is why we, in Bayesian
fashion, call the result 50-50: a measure of complete ignorance.
Yet, were we to toss 10^23 such coins we could predict with
extraordinary accuracy the fraction of coins that are heads and the
fraction that are tails.
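The same point in a few lines of Python, with 10^6 coins standing in
for 10^23 (which no computer could manage): the single toss is maximally
uncertain, yet the overall fraction of heads is predictable to within
about 1/sqrt(n).

    import random

    n = 10 ** 6   # a million honest pennies
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(heads / n)  # ~0.5, typically off by only ~0.001 (about 1/sqrt(n))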
In this analogical story, would we say that there was no reason that the
flip of a single coin came up heads? I don't think so. Such a story
was well known long before QM came along, and no one was led to argue
that we live in a random universe. Well, maybe not no one. It was
probably a common belief prior to the advent of modern science.
I know that what I'm suggesting seems to lead to hidden variables. I've
just never quite understood the claim that we live in a random universe.
What, exactly, is that supposed to imply?
Is a random universe one that is merely unpredictable? That's
epistemological. Ontologically, it must mean something like this: events
occur for no reason whatsoever, and yet they are statistically
deterministic. This appears to me, at least, to be a paradox. Does
ontological randomness entail that events occur without any antecedent
conditions, not merely without observable ones, but without any
whatsoever? Even with the pennies there are antecedent conditions: the
penny must be tossed.
In summary, I don't get Polkinghorne's argument. Please, explain.
thanks,
bill
On Thu, 1 Oct 2009, mrb22667@kansas.net wrote:
> My comments injected below...
>
> Quoting David Clounch <david.clounch@gmail.com>:
>
>>
>> Polkinghorne and Beale write about determinism and the brain [1]:
>>
>>
>> Consider a single nitrogen molecule in the air you are now breathing. On
>> average it is traveling 450 m/s and bounces off about 7 billion other air
>> molecules every second, thus 7,000 every microsecond. Suppose you knew the
>> exact position and momentum of every one of these particles (even though
>> this is impossible by Heisenberg's uncertainty principle), then perhaps you
>> could, at least in principle, predict exactly where that nitrogen molecule
>> would be one microsecond later. Of course there are all kinds of
>> complications, such as electrostatic forces, angular momentum, and so on,
>> but let's make it simple and pretend that these were all perfect spheres and
>> Newton's laws exactly applied – the kind of eighteenth-century worldview
>> that shaped the Enlightenment and still influences much of our thinking. But
>> suppose a tiny error is introduced in the angle at which this air molecule
>> is traveling, for any reason at all. A little bit of uncertainty about the
>> position of an electron, say. Call this error ε (epsilon). After one
>> collision the error is 2ε; after two collisions, 4ε; and so forth. Each
>> microsecond this error will grow by a factor of 2^7000, or roughly
>> 10^2100. The situation is clearly hopeless. Even if the initial error
>> corresponds to one Planck length (1.6 x 10^-35 m, the smallest possible
>> length, at which conventional physics breaks down) per meter, after
>> just 97 collisions the uncertainty will be enough for the position of
>> the molecule to be out by more than the diameter of a nitrogen molecule
>> (6.2 x 10^-10 m), which means it will miss the 98th target. This will
>> happen in less than a 70th of a microsecond. And making the error one
>> Planck length in the size of the observable universe (about 3 x 10^23 m)
>> just means it will miss the 176th molecule. So even with the unrealistic
>> assumption of a perfectly Newtonian world elsewhere, exact determinism
>> is dead.
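(A quick check of the doubling arithmetic, as a Python sketch. It
assumes the crudest possible model, an absolute position error that
simply doubles at every collision, starting from the figures quoted
above; the book's own model must differ in some detail, since this
version gives about 86 and 163 collisions where they quote 97 and 176,
but the qualitative conclusion is the same.)

    import math

    planck = 1.6e-35     # Planck length, m (figure quoted above)
    diameter = 6.2e-10   # nitrogen molecule diameter, m (figure quoted above)
    universe = 3e23      # size used above for the observable universe, m
    rate = 7e9           # collisions per second (figure quoted above)

    def collisions_to_miss(eps0):
        # Doublings needed for an error starting at eps0 (in meters) to
        # exceed one molecular diameter, i.e. to make the molecule miss.
        return math.ceil(math.log2(diameter / eps0))

    n1 = collisions_to_miss(planck)             # start: one Planck length
    n2 = collisions_to_miss(planck / universe)  # start: Planck length / universe
    print(n1, n1 / rate)  # 86 collisions, ~1.2e-8 s (under a 70th of a microsecond)
    print(n2)             # 163 collisions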
>
> It isn't the error amplification (chaos theory) that kills determinism,
> because the original eighteenth-century thinkers assumed up front that
> such knowledge was impossible anyway; they had already framed their
> speculation as holding *in principle*, knowing that nobody could
> actually gather all this information. And that caveat allows them (and
> us now, even with chaos theory) to reduce the initial-state error *in
> principle* to zero (infinitely smaller than a Planck length). So it is
> only the Heisenberg uncertainty mentioned below that actually drives the
> real stake into the heart of determinism. Yet for all this, it doesn't
> prevent some from still thinking deterministically about the universe as
> a strictly causal domain.
> Since my mind can't fully fathom the nature of our ontological uncertainty, I
> find myself in this deterministically minded camp at least every other Thursday.
> Maybe the atoms in my brain will happen to bounce that way today.
>
> --Merv
>
>> In fact, of course, we use
>> statistical mechanics to describe the behavior of gases and liquids and do
>> not try to predict the behavior of individual small molecules. But many
>> people think of the indeterminacy in statistical mechanics as simply a
>> limitation on our knowledge rather than a reflection of real indeterminacy
>> as in the quantum world. This kind of argument strongly suggests, to our
>> satisfaction at least, that in cases like the movement of molecules in air
>> the indeterminacy is real.
>>
>>
>> They go on to describe calcium ions in the synapses of the brain and use
>> a similar analysis. They conclude:
>>
>>
>> We will see later that this entirely destroys the idea that the brain is a
>> fully deterministic system.
>>
>>
>>
>> [1] Questions of Truth, pp. 126-127
>>