Algorithmic Complexity [2/2]

Brian D. Harper (bharper@magnus.acs.ohio-state.edu)
Tue, 23 Jan 1996 15:43:44 -0500

Now, it may seem at first that the above observations call into
question our previous discussion about the probability of achieving
a highly compressible (ordered) result. It is true that the
algorithmic complexity can be measured without any knowledge of the
process used to generate the result; nevertheless, we can still
compute the probabilities of various outcomes if we do happen to have
some _a priori_ knowledge about the process. In all the previous
examples we were assuming _a priori_ that the stochastic process had
two equiprobable outcomes (i.e. we were discussing things in terms of
the expected outcome of tossing a fair coin). The important lesson
here is that probability calculations are likely to be completely
erroneous if the _a priori_ probabilities of the various possible
outcomes are not known. This is particularly significant because, in
practice, when the individual probabilities are not known it is quite
common to simply assume that all possibilities occur with equal
probability. Hopefully this example will convince the reader that
probabilities calculated in this way can lead one to incorrect
conclusions.
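
To put a number on this, here is a quick sketch in Python (the bias
P(1) = 0.9 below is an arbitrary value chosen purely for
illustration). It compares the probability of obtaining the maximally
ordered string of 64 ones under the equiprobability assumption with
its probability under a biased process:

   # Probability of the highly ordered string "111...1" (64 ones)
   # under two different assumptions about the generating process.
   # The bias P(1) = 0.9 is an arbitrary value used for illustration.
   n = 64
   p_fair   = 0.5 ** n   # fair coin: outcomes assumed equiprobable
   p_biased = 0.9 ** n   # coin biased toward 1 with P(1) = 0.9
   print(p_fair)         # ~5.4e-20
   print(p_biased)       # ~1.2e-03, about 16 orders of magnitude larger

The ordered result that looks hopelessly improbable under the assumed
equiprobability is well within reach once the process is biased,
which is exactly how the naive calculation misleads.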

<< Will probably need something further here emphasizing that the
previous counting argument still applies when the two possibilities
are not equiprobable, i.e. the counting argument doesn't require
_a priori_ knowledge of the process; it just occurred to me that
this may not be obvious. :) (A short sketch of the bound follows
below.) What also may not be obvious is that for the ten-faced die
experiment the total number of possible results is still 2^64 and
not 10^64. Also, as might be anticipated, this last example has
important implications for probability-of-abiogenesis
calculations ;-) >>
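
For the record, the counting bound mentioned above takes only a few
lines to write down (a sketch in Python; n = 64 matches the string
length used in the examples, while the compression margin k = 10 is
picked arbitrarily):

   # Counting argument: there are only 2**(n-k) - 1 programs shorter
   # than n - k bits, so at most that many strings of length n can be
   # compressed by more than k bits.  No probabilities are involved.
   n, k = 64, 10
   total        = 2 ** n            # all binary strings of length 64
   compressible = 2 ** (n - k) - 1  # bound on strings compressible by > k bits
   print(compressible / total)      # < 2**-10, i.e. under 0.1 percent

So fewer than one string in a thousand can be compressed by more than
ten bits, whatever process happened to produce it.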

NOTES==========================================

1. I have NO idea what a _Sachertorte_ is ;-).

========================================================================
Brian Harper            |
Associate Professor     | "It is not certain that all is uncertain,
Applied Mechanics       | to the glory of skepticism" -- Pascal
Ohio State University   |
========================================================================