Shannon's source coding theorem

In information theory, Shannon's source coding theorem (or noiseless coding theorem) establishes the limits of possible data compression and the operational meaning of the Shannon entropy. Named after Claude Shannon, the source coding theorem shows that, in the limit as the length of a stream of independent and identically distributed (i.i.d.) random data tends to infinity, it is impossible to compress the data so that the code rate (the average number of bits per symbol) is less than the Shannon entropy of the source without it being virtually certain that information will be lost. However, it is possible to bring the code rate arbitrarily close to the Shannon entropy with negligible probability of loss. The source coding theorem for symbol codes places an upper and a lower bound on the minimal possible expected codeword length as a function of the entropy of the input word (viewed as a random variable) and of the size of the target alphabet.
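A formal sketch of the two statements above, using standard notation that is not defined in this record and is introduced here only for illustration: H(X) is the Shannon entropy of the source in bits, n the block length, R the code rate in bits per source symbol, a the size of the target alphabet, and L the minimal expected codeword length of a uniquely decodable symbol code.

\[
R > H(X) \;\Rightarrow\; \text{there exist rate-}R\text{ codes whose error probability} \to 0 \text{ as } n \to \infty, \qquad
R < H(X) \;\Rightarrow\; \text{error probability} \to 1,
\]
\[
\frac{H(X)}{\log_2 a} \;\le\; L \;<\; \frac{H(X)}{\log_2 a} + 1 .
\]

The first line is the block-coding form of the theorem; the second is the symbol-code form, whose proof is usually based on Kraft's inequality and Gibbs' inequality (both linked below).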

Is primary topic of
Shannon's source coding theorem
Label
Shannon's source coding theorem
Link from a Wikipage to another Wikipage
Asymptotic equipartition property
Bit
Category:Articles containing proofs
Category:Coding theory
Category:Data compression
Category:Information theory
Category:Mathematical theorems in theoretical computer science
Category:Presentation layer protocols
Channel coding
Claude Shannon
Data compression
Differential entropy
Entropy
Entropy (information theory)
Error exponent
Expected value
Gibbs' inequality
Independent and identically distributed random variables
Independent identically-distributed random variables
Information theory
Kleene star
Kraft's inequality
Noisy-channel coding theorem
Random variable
Shannon entropy
Time series
Typical set
Variable-length code
SameAs
2GoN9
m.04hf4v
Podstawowe twierdzenie Shannona
Primer teorema de Shannon
Primo teorema di Shannon
Q2411312
Shannon's source coding theorem
Teorema de codificação da fonte
Theorem codio ffynhonnell Shannon
Théorème du codage de source
Прва Шенонова теорема
Теорема Шеннона об источнике шифрования
قضیه کدگذاری منبع شانون
シャノンの情報源符号化定理
信源编码定理
Subject
Category:Articles containing proofs
Category:Coding theory
Category:Data compression
Category:Information theory
Category:Mathematical theorems in theoretical computer science
Category:Presentation layer protocols
WasDerivedFrom
Shannon's source coding theorem?oldid=1089224777&ns=0
WikiPageLength
11028
Wikipage page ID
1208872
Wikipage revision ID
1089224777
WikiPageUsesTemplate
Template:!
Template:=
Template:About
Template:Information theory
Template:Math
Template:Mvar
Template:Overline
Template:Reflist
Template:Short description
Template:Su