Re: pure chance

Brian D. Harper (harper.10@osu.edu)
Tue, 31 Dec 1996 01:23:37 -0500

At 10:16 AM 12/30/96 -0800, Greg wrote:

[...]

>
>Thanks for the explanation of the author's derivation of increasing
>information with mutation. I'm still concerned, though, that they
>are taking an incorrect view as to how broad their window into the
>genome needs to be to get correct readings about information increase
>or decrease. For example, take the mutilated English word *uestion.
>How much information do you get by knowing that the * stands for 'q'?
>Practically none at all. And this is despite the fact that the
>frequency of 'q' is very low in English! You would be silly to
>suggest 'e' for the *, and the reason is that you are considering
>the surrounding text (the word fragment, in this case).

Hopefully you'll read my reply to Gene, as I think your main concern
is that you're expecting information theory to address "information"
in the more usual sense of the word. Remember Yockey's illustration
about being unable to send Manfred Eigen a package labeled "gift,"
since "gift" means poison in German. In your example *uestion, you
realize q is appropriate because you have English words in mind.
If you showed the same thing to someone who knows only Chinese,
they would have no clue about q. The clue comes from an
understanding of how letters are arranged in English; that
understanding is not contained in the letters themselves.
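The contrast in Greg's *uestion example can be put in Shannon's own
terms as surprisal, -log2(p). A rare letter out of context carries
many bits; the same letter, once the context makes it nearly certain,
carries almost none. A minimal sketch (the probabilities below are
illustrative assumptions, not measured corpus values):

```python
import math

# Assumed, illustrative probabilities -- not exact English-corpus figures.
P_Q = 0.001            # unconditional frequency of 'q' in English text
P_Q_GIVEN_CONTEXT = 0.999  # assumed near-certainty that * in "*uestion" is 'q'

def surprisal_bits(p):
    """Shannon surprisal -log2(p), in bits."""
    return -math.log2(p)

# 'q' in isolation: a rare letter, so high surprisal (~10 bits).
print(round(surprisal_bits(P_Q), 2))
# 'q' given the surrounding fragment: almost no surprisal at all.
print(round(surprisal_bits(P_Q_GIVEN_CONTEXT), 4))
```

The point of the sketch is only that the "practically none at all"
intuition is exactly what the conditional probability captures: the
information was supplied by the reader's model of English, not by
the letter itself.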

> Sure, if
>you think DNA is like marbles stuck together, and don't consider
>long-range correlations in your prior distribution, the author's
>derivation you refer to is quite correct. I remain unconvinced,
>though, that it actually *means* anything useful.
>

As I said in my reply to Gene, I don't consider myself knowledgeable
enough to really comment on how useful information theory is.
If you are expecting it to give you some "information" about the
functionality of a protein or DNA sequence, then I don't think you
will find it of much use. Functionality would be analogous
to the meaning of the word "gift" in the illustration above. But
one doesn't need info-theory for this. There are other methods
available for determining functionality.

Brian Harper | "If you don't understand
Associate Professor | something and want to
Applied Mechanics | sound profound, use the
The Ohio State University | word 'entropy'"
| -- Morowitz
Bastion for the naturalistic |
rulers of science |