Re: 2nd Law of Thermodynamics

Bill Hamilton (hamilton@predator.cs.gmr.com)
Tue, 20 Jan 1998 16:19:21 -0500

At 12:40 PM -0500 1/20/98, David Bowman wrote:
>I too would like to butt in here with the observation that the two
>entropies listed above are really just applications of the *same*
>underlying abstract concept to two different situations. The entropies
>of Gibbs and of Shannon are both special cases of the entropy concept
>of Bayesian probability theory. That is, entropy is a measure of the
>uncertainty about the outcome of a random process characterized by a
>given probability distribution. Each probability distribution possesses
>an entropy, and entropy is a functional on the space of probability
>distributions and is therefore a statistic.
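
[A minimal sketch of the point above, added as my own illustration and
not part of David's post: the single functional H(p) = -sum_i p_i log p_i
gives Shannon's entropy in bits when the log is base 2, and gives Gibbs's
entropy when taken in natural log over microstate probabilities and scaled
by Boltzmann's constant k_B.]

    import math

    def shannon_entropy(p, base=2):
        # H(p) = -sum_i p_i log(p_i): a functional mapping a probability
        # distribution p to a single number, hence a statistic.
        return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

    K_B = 1.380649e-23  # Boltzmann constant, in J/K

    def gibbs_entropy(p):
        # Gibbs entropy S = -k_B sum_i p_i ln(p_i): the same functional,
        # taken in natural log over microstate probabilities, times k_B.
        return K_B * shannon_entropy(p, base=math.e)

    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
    print(shannon_entropy([1.0, 0.0]))  # -0.0, i.e. zero: a certain outcome
    print(gibbs_entropy([0.5, 0.5]))    # k_B * ln 2, about 9.57e-24 J/K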

David: Thanks for posting this. There's a lot of confusion about the
meaning of entropy, and I count myself among the confused. Could you
recommend any books or papers for further reading?

Bill Hamilton
--------------------------------------------------------------------------
William E. Hamilton, Jr, Ph.D. | Staff Research Engineer
Chassis and Vehicle Systems | General Motors R&D Center | Warren, MI
William_E._Hamilton@notes.gmr.com
810 986 1474 (voice) | 810 986 3003 (FAX) | whamilto@mich.com (home email)