We need to bring in quantum information theory. Shannon's noisy-channel
coding theorem shows that the capacity of a channel decreases with
noise. In the quantum variant, the capacity of a channel may
increase with noise! It also turns out that quantum partial information
(the conditional entropy) can be negative! Take-away point: information
theory is not necessarily intuitive.
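For the classical baseline, here is a minimal sketch in Python (my own
illustration, not from the thread) of how the capacity of a binary
symmetric channel, C = 1 - H(p), falls as the bit-flip probability p
rises toward 1/2:

    import math

    def binary_entropy(p):
        # Shannon binary entropy H(p) in bits
        if p == 0.0 or p == 1.0:
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        # Noisy-channel coding theorem for the binary symmetric
        # channel: capacity C = 1 - H(p) bits per channel use
        return 1.0 - binary_entropy(p)

    for p in (0.0, 0.05, 0.15, 0.30, 0.50):
        print(f"flip probability {p:.2f} -> capacity {bsc_capacity(p):.3f} bits/use")

At p = 1/2 the capacity hits zero; no classical channel gains capacity
from added noise.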
Reference: http://arxiv.org/abs/quant-ph/0505062
> Given an unknown quantum state distributed over two systems, we
> determine how much quantum communication is needed to transfer the
> full state to one system. This communication measures the "partial
> information" one system needs conditioned on it's prior
> information. It turns out to be given by an extremely simple
> formula, the conditional entropy. In the classical case, partial
> information must always be positive, but we find that in the
> quantum world this physical quantity can be negative. If the
> partial information is positive, its sender needs to communicate
> this number of quantum bits to the receiver; if it is negative, the
> sender and receiver instead gain the corresponding potential for
> future quantum communication. We introduce a primitive "quantum
> state merging" which optimally transfers partial information. We
> show how it enables a systematic understanding of quantum network
> theory, and discuss several important applications including
> distributed compression, multiple access channels and multipartite
> assisted entanglement distillation (localizable entanglement).
> Negative channel capacities also receive a natural interpretation.
>
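To see the negative "partial information" concretely, here is a minimal
sketch in Python/NumPy (my own worked example, not from the paper)
computing the conditional entropy S(A|B) = S(AB) - S(B) for a maximally
entangled Bell pair; it comes out to -1 bit:

    import numpy as np

    def von_neumann_entropy(rho):
        # S(rho) = -Tr(rho log2 rho), in bits
        eigvals = np.linalg.eigvalsh(rho)
        eigvals = eigvals[eigvals > 1e-12]   # ignore numerically zero eigenvalues
        return float(-np.sum(eigvals * np.log2(eigvals)))

    # Bell state |phi+> = (|00> + |11>)/sqrt(2); qubit A first, qubit B second
    phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    rho_AB = np.outer(phi, phi)              # pure joint state, so S(AB) = 0

    # Partial trace over A leaves B maximally mixed, so S(B) = 1
    rho_B = rho_AB.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

    S_AB = von_neumann_entropy(rho_AB)
    S_B = von_neumann_entropy(rho_B)
    print(f"S(AB) = {S_AB:.3f}, S(B) = {S_B:.3f}, S(A|B) = {S_AB - S_B:.3f}")

Classically H(A|B) >= 0 always; here S(A|B) = -1, which is exactly the
"more than certain" situation described below.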
More explanation here: http://www.nature.com/nature/journal/v436/n7051/full/436633a.html
> Claude Shannon's landmark 1948 theory of communication [1] tackles a
> nuts-and-bolts question: how do we find the best way to communicate
> using a given resource, such as a telegraph line or a satellite
> antenna? To answer that question, Shannon first took a detour into
> more philosophical territory by working out how to quantify the
> elusive concepts 'uncertainty' and 'information'. More than half a
> century on, quantum-information theorists have in many ways taken
> the opposite approach. Inspired by Shannon, but working with the
> notoriously counterintuitive theory of quantum mechanics, they seek
> to understand uncertainty and information in the quantum world by
> analysing the practical questions first, in the hope that the
> answers might then illuminate more fundamental conceptual issues.
>
> On page 673 of this issue [2], Horodecki, Oppenheim and Winter
> demonstrate how effective this approach can be by justifying, in
> operational terms, a definition of conditional uncertainty that had
> previously been widely rejected owing to its strange and apparently
> nonsensical properties. In their formulation, what had been
> pathological becomes profound: with quantum information, it is
> possible not just to be certain, but to be more than certain.
On Apr 9, 2007, at 8:39 PM, Randy Isaac wrote:
> Bill,
> All quibbling is welcome! Quibble away. These concepts aren't
> easy to understand, and I'm a long way from understanding them. I
> may well be way off base in my understanding. I'm more than happy
> to get in touch with the information science group back in Watson
> to get clarification on any of this. I'm just trying to convey what
> I have learned. Or think I have.
>
> It is true that the genome resembles a computer program to some
> extent. Actually I think we impose that resemblance in our attempt
> to understand the genome. It is human nature to use familiar
> concepts to interpret new observations.
>
> The conveyance of instructions or data is a result of the
> replication process. That is indeed a marvel of creation.
> Understanding the origin of self-replication is tantamount to
> understanding the origin of life. But that doesn't determine what
> is information and what is complexity. By the way, I'm not sure
> it's proper to say "the genome contains complexity" in the same way
> that we would say a "memory chip contains information." I would
> rather say that the genome is characterized by a complexity that is
> transferred in two ways. One is in its entirety during cell
> replication. The other is piecemeal during protein formation. We
> commonly use the term "information" to characterize what is
> transmitted. That's OK, but just realize that this is a different
> type of "information" than Shannon-information. The same rules
> don't apply.
>
> Yes, k is Boltzmann's constant and T is the temperature of the
> system that embodies the information.
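To put a number on that bound: a minimal sketch, assuming room
temperature (T = 300 K), of the kT ln 2 cost of erasing one bit:

    import math

    k_B = 1.380649e-23   # Boltzmann's constant, J/K
    T = 300.0            # assumed room temperature, K

    # Landauer's principle: erasing one bit dissipates at least kT ln 2
    E_min = k_B * T * math.log(2)
    print(f"Minimum dissipation per erased bit at {T:.0f} K: {E_min:.2e} J")
    # about 2.87e-21 J, i.e. roughly 0.018 eV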
>
> Randy
> ----- Original Message -----
> From: Bill Hamilton
> To: Randy Isaac; asa@calvin.edu
> Sent: Monday, April 09, 2007 9:11 AM
> Subject: Re: [asa] Information and knowledge
>
> Randy wrote
>
> This also leads to an important observation on the 'information' in
> the genome. Charles Bennett once gently corrected me, saying that
> technically, the more accurate term is 'complexity' not
> 'information.' The genetic code conveyed from one cell to its
> replicated cell is not 'information' as Shannon described. This
> 'information' is not independent of its physical embodiment. The
> physical embodiment IS the information. It is never converted from
> one medium to another. This is really complexity, not information.
> The supposed notions of conservation of information don't apply to
> the genetic code. It is not a message conveyed from one agent to
> another. Information about the genome and its sequence of course is
> classical information.
>
> I have to quibble with this. Evidently this distinction depends on
> the three points you stated at the beginning of your post:
>
> 1. Information is physical
> 2. Information is independent of its physical embodiment
> 3. Erasing one bit of information dissipates at least kT ln 2 of energy
>
> However, these points seem to leave out what information _is_:
> symbols organized in such a way that they convey instructions or
> data. To say that the genome contains complexity instead of
> information leaves out (IMO) this characteristic. The genome
> resembles a computer program that specifies an algorithm for
> building cellular structures. And that to me is something more
> specific than complexity.
>
> BTW, in point 3 I presume k is Boltzmann's constant. Is T the
> temperature? And if so what temperature does it represent? Some
> latent heat required to record the bit? The ambient temp? ...?
>
> Bill Hamilton
> William E. Hamilton, Jr., Ph.D.
> 248.652.4148 (home) 248.821.8156 (mobile)
> "...If God is for us, who is against us?" Rom 8:31
>
>
> ----- Original Message ----
> From: Randy Isaac <randyisaac@comcast.net>
> To: asa@calvin.edu
> Sent: Sunday, April 8, 2007 7:17:50 PM
> Subject: [asa] Information and knowledge
>
> Having finally worked my way through the mountain of posts from the
> last three weeks, I'd like to comment on a couple. There were
> several references to information, its mass and energy, and its
> relationship to the genome. Maybe we should remind ourselves of
> some of the fundamental principles of information.
>
> Claude Shannon was the key pioneer of information theory. Rolf
> Landauer may have done the most to turn it into a bona fide hard
> science. Charles Bennett has been a leader in moving Shannon's
> ideas from the classical realm to the exotic world of quantum theory.
>
> Landauer made a number of key observations:
>
> 1. Information is physical
> 2. Information is independent of its physical embodiment
> 3. Erasing one bit of information dissipates at least kT ln 2 of energy
>
>
> The first point indicates that without mass or energy, there is no
> information. How much mass is there in information? The old joke is
> that "my briefcase is so heavy because I downloaded so many books
> onto my hard drive." This confuses two types of information: the
> message or meaning that is conveyed versus the basic binary bits
> underlying that information. An 80GB hard drive contains the same
> number of bits no matter what is downloaded. They just aren't all
> intelligible until we rearrange them.
>
> The second point is easily visualized by thinking of a telephone
> conversation. As the information passes from the mind of person A
> to the mind of person B, the physical medium that conveys the
> information changes many times. The information doesn't.
>
> This also leads to an important observation on the 'information' in
> the genome. Charles Bennett once gently corrected me, saying that
> technically, the more accurate term is 'complexity' not
> 'information.' The genetic code conveyed from one cell to its
> replicated cell is not 'information' as Shannon described. This
> 'information' is not independent of its physical embodiment. The
> physical embodiment IS the information. It is never converted from
> one medium to another. This is really complexity, not information.
> The supposed notions of conservation of information don't apply to
> the genetic code. It is not a message conveyed from one agent to
> another. Information about the genome and its sequence of course is
> classical information.
>
> Randy
>