
Information theory

From Wikipedia, the free encyclopedia.

Information theory is a branch of the mathematical theory of probability and mathematical statistics that deals with the concepts of information and information entropy, communication systems, data transmission, rate distortion theory, cryptography, signal-to-noise ratios, data compression, error correction, and related topics. It is not to be confused with library and information science or information technology.

Claude E. Shannon (1916-2001) has been called "the father of information theory". His theory was the first to treat communication as a rigorously stated mathematical problem in statistics, and it gave communications engineers a way to determine the capacity of a communication channel in terms of the common currency of bits. The transmission part of the theory is not concerned with the meaning (semantics) of the message conveyed, though the complementary wing of information theory concerns itself with content through the lossy compression of messages subject to a fidelity criterion.

These two wings of information theory are joined and mutually justified by the information transmission theorems, or source-channel separation theorems, which establish bits as the universal currency for information in many contexts.

It is generally accepted that the modern discipline of information theory began with the publication of Shannon's article "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October of 1948. This work drew on earlier publications by Harry Nyquist and Ralph Hartley. In the process of working out a theory of communications that could be applied by electrical engineers to design better telecommunications systems, Shannon defined a measure of entropy:

H = -\sum_i p_i \log p_i

that, when applied to an information source, could determine the capacity of the channel required to transmit the source as encoded binary digits. If the logarithm in the formula is taken to base 2, the entropy is measured in bits. Shannon's measure of entropy came to be taken as a measure of the information contained in a message, as opposed to the portion of the message that is strictly determined (hence predictable) by inherent structure, such as the redundancy of natural languages or the statistical properties of a language relating to the frequencies of occurrence of different letter or word pairs, triplets, and so on. See Markov chains.
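
As a concrete illustration (an example added here, not part of the original article, using a made-up distribution), the following Python sketch computes the entropy of a discrete probability distribution in bits:

import math

def entropy(probabilities, base=2):
    # Shannon entropy H = -sum(p * log p); base 2 gives a result in bits.
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))  # a fair coin: 1.0 bit per toss
print(entropy([0.9, 0.1]))  # a biased coin: about 0.47 bits per toss

The biased coin carries less information per toss because its outcome is more predictable, which is the sense in which entropy measures the unpredictable portion of a message.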

Entropy as defined by Shannon is closely related to entropy as defined by physicists. Boltzmann and Gibbs did considerable work on statistical thermodynamics, and that work was the inspiration for adopting the term entropy in information theory. There are deep relationships between entropy in the thermodynamic and informational senses. For instance, Maxwell's demon needs information to reverse thermodynamic entropy, and acquiring that information exactly balances out the thermodynamic gain that the demon would otherwise achieve.

Among other useful measures of information is mutual information, a measure of the statistical dependence between two random variables. The mutual information of two random variables X and Y is defined as

I(X, Y) = H(X) + H(Y) - H(X, Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)

where H(X,Y) is the joint entropy, defined as

H(X, Y) = -\sum_{x, y} p(x, y) \log p(x, y)

and H(X|Y) is the conditional entropy of X given Y. The mutual information can thus be interpreted intuitively as the amount of uncertainty about X that is removed by observing Y, and vice versa.
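
To make the definitions concrete (an illustration added here with a hypothetical joint distribution, not part of the original article), the following Python sketch computes H(X), H(Y), the joint entropy H(X, Y), and the mutual information for two binary variables:

import math

def H(probs):
    # Entropy in bits of a collection of probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) for binary X and Y.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
p_x = {x: sum(p for (xi, _), p in p_xy.items() if xi == x) for x in (0, 1)}
p_y = {y: sum(p for (_, yi), p in p_xy.items() if yi == y) for y in (0, 1)}

H_X, H_Y, H_XY = H(p_x.values()), H(p_y.values()), H(p_xy.values())

# I(X, Y) = H(X) + H(Y) - H(X, Y)
print(H_X + H_Y - H_XY)  # about 0.28 bits

Because X and Y agree 80% of the time in this table, observing one of them removes about 0.28 of the 1 bit of uncertainty in the other.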

Mutual information is closely related to the log-likelihood ratio test for multinomial data and to Pearson's χ² test.
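
One way to see the connection (a sketch added here with a made-up contingency table, not part of the original article): for a table of N observed counts, the log-likelihood ratio statistic G equals 2N times the mutual information of the empirical joint distribution, measured in nats.

import math

# Hypothetical 2x2 contingency table of counts.
counts = [[30, 10],
          [10, 50]]
N = sum(sum(row) for row in counts)

row_tot = [sum(row) for row in counts]
col_tot = [sum(col) for col in zip(*counts)]

# G-statistic: 2 * sum O * ln(O / E), where E is the expected count under independence.
G = 2 * sum(
    counts[i][j] * math.log(counts[i][j] * N / (row_tot[i] * col_tot[j]))
    for i in range(2) for j in range(2) if counts[i][j] > 0
)

# Mutual information (in nats) of the empirical joint distribution.
I = sum(
    (counts[i][j] / N) * math.log(counts[i][j] * N / (row_tot[i] * col_tot[j]))
    for i in range(2) for j in range(2) if counts[i][j] > 0
)

print(G, 2 * N * I)  # the two values coincide

Pearson's χ² statistic for the same table is a second-order approximation to G, which is why the two tests agree closely when counts are large.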

A. N. Kolmogorov introduced a measure of the information content of an individual object, based on the length of the shortest algorithm that can recreate it; see Kolmogorov complexity.
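
Kolmogorov complexity is not computable in general, but the length of a compressed encoding gives a crude upper bound on it. The following Python sketch (an illustration added here, not part of the original article) contrasts a highly regular string with a random one:

import random
import zlib

regular = b"ab" * 500                                      # 1000 bytes, describable by a short program
noise = bytes(random.getrandbits(8) for _ in range(1000))  # 1000 bytes with no exploitable structure

# Compressed length only bounds Kolmogorov complexity from above,
# but the contrast between the two strings is still visible.
print(len(zlib.compress(regular)))  # a handful of bytes
print(len(zlib.compress(noise)))    # close to (or above) 1000 bytes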
