Entropy coding

In information theory, an entropy coding (or entropy encoding) is any lossless data compression method that attempts to approach the lower bound established by Shannon's source coding theorem, which states that any lossless data compression method must have an expected code length greater than or equal to the entropy of the source. Since 2014, data compressors have started using the asymmetric numeral systems family of entropy coding techniques, which combines the compression ratio of arithmetic coding with a processing cost similar to that of Huffman coding.
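The entropy lower bound described above can be illustrated with a small sketch (not from the source; the symbol probabilities are made up for illustration). It computes the Shannon entropy of a toy source and builds a standard Huffman code, checking that the code's expected length respects the bound; for the dyadic distribution used here, Huffman coding meets the bound exactly.

```python
import heapq
import math

def entropy(probs):
    """Shannon entropy in bits: the lower bound on expected code length."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Code-word lengths from a standard binary Huffman construction."""
    # Each heap item: (probability, unique tiebreaker, [symbol indices]).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every symbol beneath them.
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

# Illustrative dyadic source: probabilities are powers of two.
probs = [0.5, 0.25, 0.125, 0.125]
h = entropy(probs)
avg = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
assert avg >= h - 1e-9  # source coding theorem: E[length] >= entropy
```

Because every probability here is a power of two, the expected Huffman code length (1.75 bits/symbol) equals the entropy; for general distributions it can exceed the entropy by up to one bit per symbol, which is the gap that arithmetic coding and asymmetric numeral systems close.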

Has abstract
In information theory, an entropy coding (or entropy encoding) is any lossless data compression method that attempts to approach the lower bound established by Shannon's source coding theorem, which states that any lossless data compression method must have an expected code length greater than or equal to the entropy of the source. More precisely, the source coding theorem states that for any source distribution, the expected code length satisfies E_{x∼P}[ℓ(d(x))] ≥ E_{x∼P}[−log_b(P(x))], where ℓ is the number of symbols in a code word, d is the coding function, b is the number of symbols used to make output codes, and P is the probability of the source symbol. An entropy coding attempts to approach this lower bound. Two of the most common entropy coding techniques are Huffman coding and arithmetic coding. If the approximate entropy characteristics of a data stream are known in advance (especially for signal compression), a simpler static code may be useful. These static codes include universal codes (such as Elias gamma coding or Fibonacci coding) and Golomb codes (such as unary coding or Rice coding). Since 2014, data compressors have started using the asymmetric numeral systems family of entropy coding techniques, which combines the compression ratio of arithmetic coding with a processing cost similar to that of Huffman coding.
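As an example of the static universal codes mentioned in the abstract, the following is a minimal sketch of Elias gamma coding (the function names and bit-string representation are choices made for illustration). A positive integer n is written as ⌊log₂ n⌋ zeros followed by the binary form of n, so small values get short code words without any knowledge of the exact source distribution.

```python
def elias_gamma_encode(n):
    """Elias gamma code: a unary length prefix followed by n in binary."""
    assert n >= 1, "Elias gamma encodes positive integers only"
    binary = bin(n)[2:]                    # e.g. 9 -> '1001'
    return '0' * (len(binary) - 1) + binary

def elias_gamma_decode(bits):
    """Decode a single Elias gamma code word from a bit string."""
    zeros = 0
    while bits[zeros] == '0':              # count the unary length prefix
        zeros += 1
    # The next zeros+1 bits are the binary form of n.
    return int(bits[zeros:2 * zeros + 1], 2)

assert elias_gamma_encode(9) == '0001001'
assert elias_gamma_decode(elias_gamma_encode(9)) == 9
```

This matches the trade-off the abstract describes: a static code like this is simpler and cheaper than Huffman or arithmetic coding, and is a good fit when the source is known in advance to favor small values (roughly a P(n) ∝ 1/n² shape for gamma coding).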
Is primary topic of
Entropy coding
Label
Entropy coding
Link from a Wikipage to an external page
www.inference.phy.cam.ac.uk/mackay/itila/book.html
iphome.hhi.de/wiegand/assets/pdfs/VBpart1.pdf
Link from a Wikipage to another Wikipage
Arithmetic coding
Asymmetric numeral systems
Category:Entropy and information
Category:Lossless compression algorithms
Claude Shannon
Context-adaptive binary arithmetic coding
Data stream
David MacKay (scientist)
Elias gamma coding
Fibonacci coding
Golomb coding
Huffman coding
Information theory
Lossless compression
Range coding
Rice coding
Signal compression
Similarity measure
Source coding theorem
Statistical classification
Thomas Wiegand
Unary coding
Universal code (data compression)
SameAs
4743864-2
Codage entropique
Codificació entròpica
Codificación entrópica
Codificazione entropica
Entropiekodierung
MggE
Q1345239
Ентропійне кодування
Энтропийное кодирование
ترميز بالاعتلاج
کدگذاری آنتروپی
エントロピー符号
熵編碼法
엔트로피 부호화
Subject
Category:Entropy and information
Category:Lossless compression algorithms
WasDerivedFrom
Entropy coding?oldid=1118802006&ns=0
WikiPageLength
3953
Wikipage page ID
46680
Wikipage revision ID
1118802006
WikiPageUsesTemplate
Template:Compression Methods
Template:More footnotes
Template:Reflist
Template:Short description