The concept of information entropy was introduced by Claude Shannon in his 1948
paper "A Mathematical Theory of Communication",[2][3] and is also referred to
as Shannon entropy. Shannon's theory defines a data communication system
composed of three elements: a source of data, a communication channel, and a
receiver. The "fundamental problem of communication" – as expressed by Shannon – is
for the receiver to be able to identify what data was generated by the source, based on
the signal it receives through the channel.[2][3] Shannon considered various ways to
encode, compress, and transmit messages from a data source, and proved in
his source coding theorem that the entropy represents an absolute mathematical limit
on how well data from the source can be losslessly compressed onto a perfectly
noiseless channel. Shannon strengthened this result considerably for noisy channels in
his noisy-channel coding theorem.
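As an illustrative sketch of that limit (the binary source below is an example, not one taken from Shannon's paper): for a discrete source emitting symbols with probabilities p(x), the entropy in bits per symbol is

H(X) = -\sum_{x} p(x) \log_2 p(x).

A memoryless binary source that emits one symbol with probability 0.75 and the other with probability 0.25 has

H = -\bigl(0.75 \log_2 0.75 + 0.25 \log_2 0.25\bigr) \approx 0.811 \text{ bits per symbol},

so, by the source coding theorem, no lossless encoding of its output can use fewer than about 0.811 bits per symbol on average.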
Introduction
The core idea of information theory is that the "informational value" of a communicated
message depends on the degree to which the content of the message is surprising. If a
highly likely event occurs, the message carries very little information. On the other hand,
if a highly unlikely event occurs, the message is much more informative. For instance,
the knowledge that some particular number will not be the winning number of a lottery
provides very little information, because any particular chosen number will almost
certainly not win. However, knowledge that a particular number will win a lottery has
high informational value because it communicates the occurrence of a very low
probability event.
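This notion of surprise can be made quantitative through the self-information (or surprisal) of an event x,

I(x) = -\log_2 p(x),

measured in bits. As an illustrative calculation with an assumed one-in-a-million lottery: learning that a specific ticket lost conveys only about -\log_2(1 - 10^{-6}) \approx 1.4 \times 10^{-6} bits, whereas learning that it won conveys -\log_2(10^{-6}) \approx 19.9 bits.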