Yockey#1

Brian D. Harper (bharper@magnus.acs.ohio-state.edu)
Mon, 26 Feb 1996 14:05:59 -0500

Yockey#1 Response to various reviews of his book,
posted to bionet.info-theory

=======================================================================
From: hpyockey@aol.com (HPYockey)
Newsgroups: bionet.info-theory
Subject: Book Reviews of Information Theory and Molecular Biology
Date: 24 Jan 1995 10:49:13 -0500
Organization: America Online, Inc. (1-800-827-6364)
Lines: 432
Sender: root@newsbf02.news.aol.com
Message-ID: <3g37hp$e7b@newsbf02.news.aol.com>
Reply-To: hpyockey@aol.com (HPYockey)

Review of book reviews of Information Theory and Molecular Biology by
Hubert P. Yockey published by Cambridge University Press 1992

Why is information theory and coding theory important in molecular
biology? Gregor Mendel proved that inheritance is particulate and does not
blend. Morgan showed that inheritance is linear. Watson and Crick
demonstrated that inheritance is digital. Thus the genome is a message
recorded digitally in sequences of nucleotides in DNA. While
Nature has been recording the life message digitally and using a code for
at least 3.8 billion years, it is only recently that modern communication
engineers have discovered the benefits of recording and transmitting
messages in digitized form. The genetic message is isomorphic with
messages in general and the genetic code is isomorphic with all codes.
This means that information theory and coding theory are essential to
molecular biology.
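
To make the capacity of such a digital recording concrete, here is a
minimal sketch in Python (my illustration, not a calculation from the book;
equiprobable, independent symbols are a simplifying assumption):

    import math

    # A DNA message draws each symbol from the 4-letter alphabet A, C, G, T.
    # Assuming equiprobable, independent symbols, each nucleotide carries
    # at most log2(4) = 2 bits, and each triplet codon at most 6 bits.
    bits_per_nucleotide = math.log2(4)
    bits_per_codon = 3 * bits_per_nucleotide
    print(bits_per_nucleotide, bits_per_codon)   # 2.0 6.0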

Is life to be explained ONLY by its chemistry? The fact that inheritance
is particulate, linear and digital shows that life must be more than just
complicated chemistry. There is no trace in non-living matter of chemical
reactions being directed by a message recorded digitally in an alphabet,
nor of a code between alphabets. If the genetic processes were purely
chemical it "stands to reason" that many more than 20 amino acids would be
used in the formation of protein, yet only 20 of the hundreds of possible
amino acids are used directly in the formation of protein, the same 20 for
ALL organisms everywhere, past, present and future. If the genetic
processes were purely chemical the law of mass action would govern the
placement of amino acids in the protein sequences in accordance with their
concentration. On the contrary, the genetic message puts each amino acid
just where it belongs, largely independent of its concentration.
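
The codon arithmetic behind the vocabulary of 20 is easily checked; a
short sketch (standard genetic-code counts, not an argument from the book):

    from itertools import product

    # All possible triplet codons over the 4-letter DNA alphabet.
    codons = [''.join(c) for c in product('ACGT', repeat=3)]
    print(len(codons))   # 64

    # The standard code assigns these 64 codons to only 20 amino acids
    # plus stop signals, so the code is necessarily degenerate:
    print(64 / 21)       # about 3 codons per assignment, on average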

Henry Quastler was the first to realize the importance of information
theory in molecular biology. Quastler organized a symposium at the
University of Illinois and published the proceedings as Information Theory
in Biology, University of Illinois Press (1953). A symposium on
Information Theory in Biology was held at Gatlinburg, Tennessee in October
1956, under the auspices of Oak Ridge National Laboratory. Pergamon Press
published the proceedings in 1958, edited by Hubert P. Yockey, Robert
Platzman and Henry Quastler. George Gamow attended the symposium and
contributed a paper.

The field has now been brought up to date in Information Theory and
Molecular Biology by Hubert P. Yockey published by Cambridge University
Press in 1992. The book has received six reviews. It is timely to look at
the impressions these reviewers had. Here are some excerpts:
"The book is written in a very lucid style. This interesting, and
sometimes entertaining, book provides a timely summary of the field. ... This
is a work that no one interested in the fundamental relationship between
physics and biology can ignore. ... There is, still, much to commend in this
book. The style is lively and often humorous. ... Information theory is, as
explained in the prolog, the mathematical foundation of molecular biology.
Indeed, Yockey unfolds in the monograph a comprehensive self-consistent
philosophy of molecular biology. It is this philosophy and its way of
presentation that intrigued me throughout the book. I was fascinated by
the author's rich style and broad erudition. Furthermore, I was stimulated
by his thought provoking arguments against the conventional methodology of
science in general and of science related to the origin of life in
particular."
These comments are high praise for a technical book full of mathematics.
Such books are supposed to be pedestrian, prosaic, dull and dry.

An unsigned review appeared in Cellular and Molecular Biology, December
1993.
"These theories on information and on coding have been developed by Yockey
to deal with communications problems in a quantitative and mathematical
manner in molecular biology, in order to bridge the gap in communication
between molecular biologists and mathematicians,
providing mathematical definitions for the vocabulary with which basic
questions in molecular biology are debated. The book is divided into two
parts. The first consists of an introduction to the mathematical concepts
of information theory and demonstrates how to relate them to the
genetic code. The second part illustrates the application of these
fundamental ideas to molecular biology. The question is how to use these
to analyze and test our ideas of the genetic code. The author begins his
demonstration with the first information-carrying macromolecules and the
early genetic code, and shows how information may be transmitted in aging cells.
Such a book is useful not only for advanced undergraduates, but also for
graduate students and fully developed research scientists in cellular and molecular
biology and evolution."

Book Review by J. Napur, New Delhi, Mathematical Reviews 94b:92015
Napur describes the book, noting that applied mathematicians in search of
exciting fields will find it useful. He concludes his evaluation:
"The book is written in a very lucid style. The roles of physics,
chemistry and mathematics in biology are discussed again and again and the
similarity between theoretical physics and theoretical molecular biology
is always brought out. It is emphasized that molecular biology
applications of information theory and coding theories should explain the
known experimental data or should suggest further experiments for
verification of theoretical results obtained."

FEBS Letters 1993 volume 327 by Gordon Atkins
The reviewer describes the book and gives his evaluation:
"This interesting, and sometimes entertaining, book provides a timely
summary of the field. The mathematics is formally and systematically
presented to a degree not usually found in books aimed at biologists. ... Dr.
Yockey's treatment of the subjects within this section is comprehensive
and rigorous and an important feature is his critical look at the
sometimes unsound conclusions of biologists. This is an excellent book,
well-written and enjoyable to read. It can be read by anyone with an
interest in the area of molecular biology and evolution---most of the
second section is, in fact, comprehensible without the detailed knowledge
of mathematics presented in the first."

There is always a subtle and implied impression that the reviewer was
chosen because he is more informed than the author and could have written
a better book if only he had put his shoulder to the wheel, his eye on the
stars, his feet on the ground and his hand to pen and paper. Book reviews
may in fact provide an opportunity for the reviewer to exhibit his
misunderstanding of the book. This was the case with the following
reviewers, one of whom happens to be a professor of physics and two of
whom are professors of chemical physics.

Nature volume 362 page 509 (1993) published a review by Professor Hermann
Haken at the Institut fuer Theoretische Physik und Synergetik of the
University of Stuttgart. Haken has no problem with the mathematics in Part
I that prepares the reader for specific practical problems in molecular
biology. When these results are applied, Haken finds that I have committed
heresy by pointing out the deficiencies of some popular beliefs in
molecular biology, and lese majesty by taking in vain the "seminal work
(sic) of Professor Manfred Eigen".

"In several places the author (Hubert P. Yockey) states that although
molecular biology must be consistent with chemical and physical processes,
biological principles cannot be derived from physics and chemistry alone.
This statement holds still more for the relationship between mathematics
and molecular biology (or any other branch of science)." Haken then makes
a coy remark attempting to leave the impression that the Huns are
assaulting the battlements of molecular biology:
"But Yockey repeatedly conveys the impression that the laws of molecular
biology can be derived from mathematics."

Haken's remark is nonsense. What I actually stressed is that the laws of
molecular biology lie in the axioms from which one reasons, specifically
the sequence hypothesis and the genetic code. Mathematics is the highest
form of human reasoning about the consequences of those axioms.
Nevertheless, mathematics has limitations, pointed out by Goedel and Turing,
that lead to undecidability. For example, no general procedure can determine
whether an arbitrary computer program will halt.
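
Turing's diagonal argument behind that statement fits in a few lines. A
sketch only, built around a hypothetical oracle halts() which the argument
shows cannot exist:

    # halts() is a hypothetical oracle, assumed (for contradiction) to
    # decide whether program(argument) eventually halts.
    def halts(program, argument):
        raise NotImplementedError("no such general test can exist")

    def paradox(program):
        # Do the opposite of whatever the oracle predicts about
        # running 'program' on its own source.
        if halts(program, program):
            while True:   # oracle says it halts, so loop forever
                pass
        else:
            return        # oracle says it loops, so halt at once

    # If halts(paradox, paradox) returned True, paradox(paradox) would
    # loop forever; if it returned False, paradox(paradox) would halt.
    # Either answer is wrong, so no general halting test exists.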

Haken now makes an emotional accusation of heresy and lese majesty: "So
the author's polemic against the seminal work of Manfred Eigen, who
clearly recognized and formulated the fundamental role of these dynamics,
is not only unfounded but misleading."

As Haken wrote in his first paragraph: "Quite often the word 'information'
is used with different meanings but from the very beginning he (Yockey)
sticks to a single interpretation - Shannon information." So far, so good,
but, rather subtly, in his last paragraph, hoping the reader will
not notice, Haken uses 'information' in the sense of 'meaning' or
'knowledge'. "Yockey makes wild extrapolations in a futile attempt to show
how the classical concept of information theory can be applied to problems
in generating information." Haken has missed Section 12.1.2 where my "wild
extrapolations" are related to the well established concepts of
Kolmogorov-Chaitin algorithmic entropy.

I asked Professor Haken for a list of his publications. He graciously
complied and proved to be an expert in lasers, quantum field theory of
solids and synergetics. The only publication I found on his list
pertaining to biology is: Entstehung von Biologischer Information und
Ordnung by Haken and Haken-Krell, published by Wissenschaftliche
Buchgesellschaft, Darmstadt (1989). In English: Origin of Biological
Information and Order, published by Scientific Book Company, Darmstadt
(1989).

A quotation from Chapter 8 will support the point I wish to make. First in
the original German: "Der Shannonsche Informationsbegriff sagt aber nichts
aus darueber, ob eine Nachricht sinnvoll oder sinnlos, wertvoll oder
wertlos ist, das heisst, es geht ihm jeder Sinngehalt ab, oder, in anderen
Worten, es fehlt ihm die Semantik. Gerade im biologischen Bereich kann
dieses Fehlen ein wesentliches Manko bedeuten."

My English translation: "The Shannon information concept says nothing
about whether the message is meaningful or meaningless, valuable or
valueless; that is, it lacks any content of meaning or, in other words,
it lacks semantics. Precisely in the biological domain this lack can mean
a substantial deficiency."

Early in the history of information theory, philosophers traded on the
word information and thought they had a mathematical means for dealing
with semantics, in spite of Shannon's denial in the second paragraph
of his 1948 paper. See Bar-Hillel (1955) Philosophy of Science vol. 22
pp. 86-105 and Bar-Hillel and Carnap (1953) British Journal for the
Philosophy of Science volume 22 pp. 147-157. The reason this cannot be done
is that there is no mathematical measure for 'value' or 'meaning'. The
meaning of words depends on the language and the context. 'Meaning' cannot
be measured. As I pointed out in my book, if you send a package labeled
'Gift' to Professor Eigen or Professor Haken, you will violate German
postal regulations: 'Gift' means poison in German.

As R. V. L. Hartley pointed out in 1928: "What I hope to accomplish in
this direction is to set up a quantitative measure whereby the capacities
of various systems to transmit information may be compared." Bell System
Technical Journal volume 7 pp535-563, (1928). Shannon carried Hartley's
plan forward and found the fundamental theorems of information theory and
coding theory and the quantitative measure in bits and bytes as we have it
today.
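
Both measures are simple to state; a minimal sketch (my illustration of
Hartley's and Shannon's formulas):

    import math

    # Hartley (1928): an alphabet of s symbols and messages of length n
    # give a capacity measure of n * log2(s) bits.
    def hartley_bits(s, n):
        return n * math.log2(s)

    # Shannon (1948): for unequal symbol probabilities the entropy per
    # symbol is -sum(p * log2(p)); it reduces to log2(s) when p = 1/s.
    def shannon_bits(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(hartley_bits(2, 8))        # 8.0 bits in one byte
    print(shannon_bits([0.25] * 4))  # 2.0 bits per equiprobable DNA symbol
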
My 'polemic against the seminal work of Manfred Eigen' exposes his
confusion of philosophical notions of semantics and information measured
in bits as well as a number of other basic faults. Eigen feels free to
introduce conjectures cooked up ad hoc to suit each problem. One can solve
(sic) any problem with enough ad hoc conjectures. To remedy what he sees
as an inadequacy in "classical information theory" he calls for a purely
empirical "value parameter" that is characteristic of "valued
information". He states that this "valued information" is reflected by
increased "order". On the contrary, it is well known in information theory
that 'increased order' decreases the information content of a message.
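
That point follows at once from Shannon's formula: the more biased, that
is, the more ordered, the symbol distribution, the lower the entropy per
symbol. A quick illustration (the distributions are invented for the
example):

    import math

    def entropy(probs):
        # Shannon entropy in bits per symbol.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximal disorder
    print(entropy([0.85, 0.05, 0.05, 0.05]))  # about 0.85 bits: more order
    print(entropy([1.0]))                     # 0.0 bits: perfect order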

Anyone who is computer literate knows that, in the context of computer
technology, the word information does not mean knowledge. Along with many
other authors, Eigen makes a play on words by using information in the
sense of knowledge, meaning and specificity. For example, in
Naturwissenschaften (1971) volume 58, pages 465-523 (in English) he states with
reference to sequences in DNA that: "Such sequences cannot yet contain
any appreciable amount of information." He means knowledge or specificity.
Eigen uses the word 'information' in two different senses in a single
passage: "Information theory as we understand it today is more a
communication theory. It deals with problems of processing information
rather than of "generating" information."

Eigen purports to reinvent Shannon's Channel Capacity Theorem in order to
deal with an "error catastrophe" He does this by dealing with the errors
themselves rather than with information mutual entropy that Shannon proved
to be the correct concept. Eigen and all the Goettingen school are also
completely unaware of the Shannon-McMillan-Breiman theorem. Had anyone in
the Goettingen school finished reading Shannon's 1948 paper they would
have found both theorems.
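
For the reader who wants to see what the channel capacity theorem says
about error rates, here is a minimal sketch for the binary symmetric
channel (my example, not Eigen's or the book's calculation):

    import math

    def h2(p):
        # Binary entropy function, in bits.
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        # Capacity of a binary symmetric channel with error rate p is
        # C = 1 - h2(p); Shannon proved that coding makes transmission
        # at any rate below C arbitrarily reliable, so there is no
        # inevitable "error catastrophe" below capacity.
        return 1.0 - h2(p)

    for p in (0.001, 0.01, 0.1):
        print(p, bsc_capacity(p))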

Notice that Haken did not challenge my Chapter 10 on self-organization
directly but appealed to emotional charges of heresy and lese majesty.
When distinguished scientists are wrong they are just as wrong as the rest
of us. The principle that the king can do no wrong does not apply in
science. Idols have feet of clay.

Professor Avshalom C. Elitzur published a review in Contemporary Physics
volume 34 pages 275-278 (1993). Elitzur has trouble deciding whether he
likes the book or not. "This is mainly a critical book. The author
repeatedly warns against careless use of such concepts as 'information',
'order' and 'complexity'. ... This is a work that no one interested in the
fundamental relationship between physics and biology can ignore." ... "There
is, still, much to commend in this book. The style is lively and often
humorous." ... "Yockey's skepticism in itself is a good measure to
counterbalance the self-confidence of authors who believe that the secrets
of life's origin are already written down in their notes."

Elitzur does me the great compliment of quoting a comment from a
publication I made in Symposium on Information Theory in Biology published
in 1958 [before he was born?].

On the other hand, Elitzur, who is a professor in the Department of
Chemical Physics at The Weizmann Institute in Rehovot, Israel, is alarmed
by my remark (page 313): "...it is easy to see that thermodynamics has
nothing to do with Darwin's theory of evolution." He writes: "Upon reading
this uncompromising statement the bewildered reader may recall several
discussions he or she has previously read concerning the apparent conflict
between the Second Law and biological order-growth." Elitzur calls the
Second Law of Thermodynamics an explanation of evolution.

The context in which this remark was made is in Section 12.1 where I
discussed the assertion made by creationists today and by critics of
Darwin in the nineteenth century, that there is a conflict between
evolution and the second law. Creationists say that, since the second law
cannot be challenged, Darwin's theory of evolution must be abandoned in
favor of special creation. Even a scientist as eminent as Eddington
believed there was such a conflict. Had Elitzur overcome his bewilderment
and read the next paragraph on page 313 he would have found the
explanation: "In fact, evolution requires an increase in
Kolmogorov-Chaitin algorithmic entropy of the genome in order to generate
the complexity necessary for higher organisms".

Elitzur has committed heresy by ignoring a statement by his mentors Eigen
and Schuster (1977) ( Eigen, M. and Schuster, P. (1977) The hypercycle: A
principle of natural self-organization. Part A. Emergence of the
hypercycle. Naturwissenschaften volume 64, 541-65) : "In physics we know
of principles which cannot be reduced to any more fundamental laws. As
axioms, they are abstracted from experience, their predictions being
consistent with the consequences that can be subjected to experimental
test. Typical examples are the first and second law of thermodynamics.
Darwin's principle of natural selection does not fall into the category of
first principles."

Elitzur is sometimes careless with his choice of quotations. He credits
the following passage to Shannon but it was written by Weaver, not
Shannon:
"In fact, I suspect that Yockey's reproach reaches the founder of
information theory himself: Shannon (Shannon and Weaver 1963), having
quoted Eddington's famous remark about the Second Law's supreme position,
went on to muse that:
'thus when one meets the concept of entropy in communication theory, one
has a right to be excited-a right to expect that one has hold of something
that may turn out to be very basic and important.' This enthusiasm clearly
indicates that Shannon did not think the classical concept of entropy to
be much different from his own."
Shannon was quite clear on the point that entropy in statistical mechanics
is a concept different from entropy in information theory.

Elitzur is confused about the relation between classical thermodynamics,
statistical mechanics and information theory. He uses thermodynamics and
statistical mechanics interchangeably. "It has often been pointed out that
thermodynamics is a substance-independent theory that holds for all
systems irrespective of their structure, chemical constituents or specific
form of energy. Therefore, so goes the argument, thermodynamics provides
the most promising prospect for a physical foundation of biology."
Chemists and chemical engineers will be astonished to learn that they have
wasted their time preparing tables of entropy, specific heats, etc., for
various substances.

Classical thermodynamics deals with heat and work directed to the
operation of heat engines and is independent of the existence of atoms. In
direct contrast, statistical mechanics depends on the existence and
properties of atoms and applies the Newtonian laws of mechanics and the
theory of probability to the interaction of atoms and the solution of
problems of heat, heat capacity, etc. Specifically, the concept of entropy
in classical thermodynamics is different from that in statistical
mechanics and from that in information theory.

Elitzur is dismayed by my finding (Yockey, 1974, 1977) that there is no
relation between Shannon entropy in information theory and
Maxwell-Boltzmann-Gibbs entropy in statistical mechanics. Let us see why
that is so. According to the modern theory of probability one cannot speak of
a probability without first establishing a probability space and setting
up a probability distribution of the random variables, appropriate to the
problem. The axioms of probability theory must be satisfied in order to
avoid a "Dutch book" and to be sure that we are not using knowledge we do
not have (Yockey, 1992, p20-33)

The probability space in statistical mechanics, called phase space by
theoretical physicists, is six dimensional and is defined by the position
and momentum vectors of the ensemble of particles. The values of these
position and momentum vectors are random variables and the p_i form
probability vectors referring to particle i. The function for entropy in
statistical mechanics, S, has the dimensions of the Boltzmann constant k
and has to do with energy, not information. Shannon entropy has no thermal
or mechanical dimensions.

Information theory is concerned with messages expressed in sequences of
letters selected from a finite alphabet by a Markov process. The
probability space is defined by the letters of the alphabet under
consideration, which are random variables and the p_i form probability
vectors accordingly.

To illustrate this point further, one may consider the probability space
of a dice game that consists of the numbers 2 through 12 and calculate the
corresponding entropy (Yockey, see exercise on page 88). Clearly, the
entropy of a dice game has nothing to do with statistical mechanics and
thermodynamics. It may have something to do with information theory since
a sequence of letters selected from the alphabet is generated as a Markov
process by a series of tosses of the dice. Such a sequence of letters
forms a message in which some gamblers find meaning. For these reasons
entropy in statistical mechanics and entropy in information theory are
different concepts that have no relationship that enables us to make an
equivalence of one to the other.

Elitzur gets off a number of bloopers: "Information theory, according to
Yockey, 'shows that it is fundamentally undecidable whether a given
sequence has been generated by a stochastic process or by a highly
organized process' (p82). This must be an amazing statement for anyone
familiar with the basic concepts of information theory where information
is defined as the very opposite of randomness."

'Information' is, of course, not the very opposite of randomness. Elitzur
is using the word 'information' in the semantic sense as synonym for
knowledge or meaning. Everyone knows that a random sequence, that is, one
chosen without intersymbol restrictions or influence, carries the most
information in the sense used by Shannon and in computer technology. Note:
For a brief explanation of randomness, complexity, order and information
see Yockey Nature 344 p823 (1990).
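
An empirical version of the same point, as a sketch (the sequences are
invented for the example): estimate the per-symbol entropy from observed
symbol frequencies of a random string and of a highly ordered one.

    import math
    import random
    from collections import Counter

    def empirical_entropy(seq):
        # Per-symbol entropy estimated from observed frequencies, in bits.
        n = len(seq)
        return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

    random.seed(1)
    random_seq = ''.join(random.choice('ACGT') for _ in range(10000))
    ordered_seq = 'AAAC' * 2500

    print(empirical_entropy(random_seq))   # close to the 2-bit maximum
    print(empirical_entropy(ordered_seq))  # about 0.81 bits: order costs information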

"The author's views also seem to be based on a peculiar approach to the
philosophy of science. He recommends completely cleaning the field of all
paradigms (p 336) in contrast to the conventional wisdom that a loose
paradigm may prove better than no paradigm at all." Apparently professors
like loose paradigms so they have something to put in final examinations!

Professor Shneior Lifson published a review in BioEssays vol. 16 pages
373-375 (1994). Lifson, also a professor at the Chemical Physics
Department of the Weizmann Institute, found it difficult to decide whether
he liked the book.

"However, the purpose of this monograph reaches much further . [Than
application of Kolmogorov- Chaitin algorithmic information theory to
molecular biology] Information theory is, as explained in the prolog, the
mathematical foundation of molecular biology. Indeed, Yockey unfolds in
the monograph a comprehensive self-consistent philosophy of molecular
biology. It is this philosophy and its way of presentation that intrigued
me throughout the book. I was fascinated by the author's rich style and
broad erudition. Furthermore, I was stimulated by his thought provoking
arguments against the conventional methodology of science in general and
of science related to the origin of life in particular."

However, when Lifson collides with the results of the application of
information theory and coding theory he finds only heresy and lese
majesty. Both are capital offenses in the groves of academe. He cannot
bring himself to accept the fact that information theory does not address
meaning nor does it have a measure for meaning. In this he follows Eigen,
Haken and his colleague Elitzur. He has the same problem as Haken with
regard to the relation of mathematics and theoretical physics, theoretical
chemistry and theoretical molecular biology. He accuses me of vitalism
whereas I specifically show why vitalism is both wrong and unnecessary. I
was especially amused by his consternation at my reference to the Grand
Academy of Lagado in Jonathan Swift's Gulliver's Travels and my comparison
of Eigen and Schuster as well as Dawkins to the savants at that
distinguished institution. Remember that Lifson was fascinated by my
"broad erudition."

A reply to Lifson's book review has been accepted by BioEssays and is now
in press. I shall not comment on Lifson's review further in this note.

Of course there are controversial conclusions in the book. If there
weren't there would have been no point in publishing it. I think the most
important contribution to practical molecular biology is in Chapter 7 on
the evolution of the genetic code. I began with the proposal that the
original genetic code was a doublet code in which the third nucleotide was
silent and which coded for only eight or so amino acids. As the code
evolved by enlarging its vocabulary, coding theory shows that the number of
possible codes decreases and finally must pass through a bottleneck when
there are 14 or 15 amino acids in the vocabulary. The vocabulary could
then be enlarged to 20 only by applying specificity to the third
nucleotide. By analyzing the evolution of such codes by means of coding
theory I showed that the existence of mitochondrial and other codes that
differ in a few assignments from the standard genetic code is a required
consequence of that evolution.
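
The combinatorial backbone of that argument can be sketched in a few
lines (a simplification for illustration, not a reconstruction of the
Chapter 7 calculation):

    from itertools import product

    # A doublet code reads only two nucleotide positions, so it can
    # distinguish at most 4**2 = 16 codon classes: room for eight or so
    # amino acids plus punctuation, but not for 20.
    print(len([''.join(c) for c in product('ACGT', repeat=2)]))   # 16

    # Only when the third position acquires specificity does the code
    # gain 4**3 = 64 distinguishable codons, enough for 20 amino acids
    # plus stop signals, with the observed degeneracy left over.
    print(len([''.join(c) for c in product('ACGT', repeat=3)]))   # 64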

The question of the separate genetic codes in the mitochondria is usually
sloughed off as a trivial curiosity. On the contrary, I suggest that
Nature is trying to tell us something if we will only listen. This
discussion is a qualitative review of what I have dealt with at full
industrial strength in Information Theory and Molecular Biology.

This book is a pivotal publication for all work on theoretical molecular
biology and belongs in the library of everyone interested in molecular
biology. Also, as Elitzur said: "This is a work that no one interested in
the fundamental relationship between physics and biology can ignore."

Give me your opinion, but only after you read the book!
==================================================================

========================
Brian Harper |
Associate Professor | "It is not certain that all is uncertain,
Applied Mechanics | to the glory of skepticism" -- Pascal
Ohio State University |
========================