Re: Random processes create meaning

From: mortongr@flash.net
Date: Thu Sep 21 2000 - 15:04:36 EDT

    Hi Brian,

    You wrote:

    >Hi Glenn. As usual, you have a very interesting and creative idea. I
    >believe, though,
    >that I can prove the opposite of your claim either by Shannon information
    >theory
    >or algorithmic information theory. I can fill in a few details if anyone
    >wants to see it.

    >Based on this I'll make the prediction that what we have here is really an
    >example
    >of ID. You constructed an interesting message and then worked backwards to get
    >the keyword. Am I right?

    I will absolutely agree that I worked backwards, but that doesn't necessarily make it ID. I will explain below.

    >I think the following would be an interesting experiment to illustrate how
    >far off the
    >mark your example is. Suppose you and I and anyone else who wants to play
    >constructed 10 keys at random and then used them to decode the intercepted
    >message. Everyone then posts their decoded messages here and we'll see how
    >many are intelligible English sentences. The problem is that messages having
    >the statistical structure of English are in the low probability group for
    >randomly
    >generated sequences. The probability of getting *any* English message is
    >thus close to zero and approaches zero as the message length increases.

    First off, I think you missed the point. I didn't say anything about Shannon entropy, and I didn't say anything about the low probability group. You and I agree on this. What I was pointing out is that the statements by these gentlemen that meaning can't come from random processes are simply wrong. I made no claim as to how often it happens. What I clearly demonstrated is that statements like those I quoted are simply wrong and should not be taught.

    As to randomness, the first keyword in my example

    plmoezqkjzlrteavcrcby

    WAS generated by a random process. This totally random sequence was then convolved (added letter by letter) with a meaningful sentence, yielding a sequence that has no more structure than the random sequence itself:

    pefogjjrnulceiyvvucxl

    By all appearances this becomes a random piece of noise, like static on the radio. If the static is great enough, you can't understand the singer or speaker. If the static has a power equal to or greater than the power of the speaker, no amount of processing the signal will bring the speaker in clearly. This is the state of the encoded sequence above. The noise of the random sequence is as great as or greater than the signal of the meaningful sentence 'attack the valley at dawn'.
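
    Since I worked that example by hand, here is a minimal Python sketch of the arithmetic, assuming simple letter-by-letter addition mod 26 over a-z with spaces dropped (a Vigenere-style scheme); it reproduces the two sequences above:

        # Combine a text with a keyword by letter-by-letter addition (sign=+1)
        # or subtraction (sign=-1) mod 26 over the letters a-z.
        def combine(text, key, sign=1):
            return ''.join(
                chr((ord(t) - ord('a') + sign * (ord(k) - ord('a'))) % 26 + ord('a'))
                for t, k in zip(text, key))

        plaintext = "attackthevalleyatdawn"    # 'attack the valley at dawn', spaces removed
        keyword   = "plmoezqkjzlrteavcrcby"    # the randomly generated keyword
        cipher    = combine(plaintext, keyword, +1)
        print(cipher)                          # -> pefogjjrnulceiyvvucxl
        print(combine(cipher, keyword, -1))    # -> attackthevalleyatdawn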

    Now, let's continue the radio analogy further. Assume that you want to talk to me in total privacy, with no listeners capable of hearing us as we chat about some crime we wish to perpetrate. You call me and tell me to tape the phone conversation digitally. You can get a white noise generator, play it against the phone, and speak to me as I record our conversation. No one will be able to decipher what you say, including me. But if you record the noise exactly, apart from your voice, and then send me that tape, I can successfully subtract the noise from my recording and hear you clearly. But no one else can do that unless they intercept and copy the tape of the noise. I can't even use my own tape of the conversation to remove the noise, as my tape is of both the conversation AND the noise, and as you know, addition is irreversible unless one knows the initial values. Having your tape of just the noise gives me one of the initial values, and all I have to do is subtract one from the other.
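
    To make the arithmetic of the tape analogy concrete, here is a small sketch (the voice samples and noise values are made up for illustration): the recorded tape is signal plus noise, and only someone holding the separate noise tape can subtract it back out.

        import random

        signal = [3, 1, 4, 1, 5, 9, 2, 6]                     # hypothetical digitized voice samples
        noise  = [random.randint(-100, 100) for _ in signal]  # the white-noise tape, recorded separately

        tape = [s + n for s, n in zip(signal, noise)]         # what actually gets recorded off the phone

        # With the noise tape in hand, subtraction recovers the voice exactly.
        recovered = [t - n for t, n in zip(tape, noise)]
        assert recovered == signal

        # Without the noise tape, the sum alone does not determine either addend:
        # many different (signal, noise) pairs produce the same tape.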

    That is what a keyword is. It is taped noise that can be subtracted from the signal. Noise is random, as are all the keywords. Are there noise streams that will take your tape and return a different message? Of course. Such noise can be random and still produce a meaningful statement, albeit the wrong message.
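
    As a quick illustration of that last point (the alternative sentence below is my own invention, chosen only to show that such a noise stream exists), one can work backwards from the intercepted sequence to a second keyword that decodes it into an entirely different English message:

        # Same letter-by-letter arithmetic as in the sketch above.
        def combine(text, key, sign=1):
            return ''.join(
                chr((ord(t) - ord('a') + sign * (ord(k) - ord('a'))) % 26 + ord('a'))
                for t, k in zip(text, key))

        cipher = "pefogjjrnulceiyvvucxl"     # the intercepted sequence
        other  = "retreattothenorthside"     # a hypothetical alternative reading, 21 letters
        key2   = combine(cipher, other, -1)  # keyword = cipher minus alternative plaintext

        print(key2)                          # a perfectly usable 'noise stream'
        print(combine(cipher, key2, -1))     # -> retreattothenorthside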

    My point is that one can't say that randomness can't produce meaning or specificity. We will reserve for a later time the discussion of the frequency of such noise/keyword streams.

    Only one comment along that line: with short sequences, finding things by random search becomes feasible. And there is also the phenomenon of directed evolution, in which biopolymers about 100 nucleotides long are found to perform useful functions at a rate of around 10^-13, which means a medium-sized vat can find useful biopolymers for a particular function out the gazoo.
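
    A rough back-of-the-envelope calculation (the pool size below is my own assumption for illustration, not a figure from the directed evolution literature) shows why a rate of 10^-13 is easy to beat with a vat of random sequences:

        # Expected number of functional ~100-mers in an assumed pool of random sequences.
        AVOGADRO = 6.022e23
        pool_mol = 1e-9     # assume a nanomole of random sequences in the vat (illustrative)
        hit_rate = 1e-13    # fraction performing a given useful function (the quoted rate)

        sequences     = AVOGADRO * pool_mol   # about 6e14 random sequences
        expected_hits = sequences * hit_rate  # about 60 functional sequences expected
        print(f"{sequences:.1e} sequences, ~{expected_hits:.0f} expected functional")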


