Binary entropy function

In information theory, the binary entropy function, denoted $\operatorname{H}(p)$ or $\operatorname{H}_\text{b}(p)$, is defined as the entropy of a Bernoulli process with probability $p$ of one of two values. It is a special case of $\mathrm{H}(X)$, the entropy function. Mathematically, the Bernoulli trial is modelled as a random variable $X$ that can take on only two values: 0 and 1, which are mutually exclusive and exhaustive. If $\operatorname{Pr}(X=1) = p$, then $\operatorname{Pr}(X=0) = 1-p$ and the entropy of $X$ (in shannons) is given by $\operatorname{H}_\text{b}(p) = -p \log_2 p - (1-p) \log_2 (1-p)$, where $0 \log_2 0$ is taken to be 0. The logarithms in this formula are usually taken (as shown in the graph) to the base 2. See binary logarithm.
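A minimal Python sketch of this formula, for illustration only (the function name binary_entropy and the sample probabilities are my own, not taken from the source):

import math

def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in shannons (logarithms to base 2).

    Terms with probability 0 contribute nothing, following the
    convention 0 * log2(0) = 0 stated in the definition above.
    """
    if p < 0.0 or p > 1.0:
        raise ValueError("p must lie in [0, 1]")
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

print(binary_entropy(0.5))   # 1.0 shannon: an unbiased coin flip
print(binary_entropy(0.11))  # ~0.5 shannon: a heavily biased coin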

Depiction
Binary entropy plot.svg
Has abstract
In information theory, the binary entropy function, denoted $\operatorname{H}(p)$ or $\operatorname{H}_\text{b}(p)$, is defined as the entropy of a Bernoulli process with probability $p$ of one of two values. It is a special case of $\mathrm{H}(X)$, the entropy function. Mathematically, the Bernoulli trial is modelled as a random variable $X$ that can take on only two values: 0 and 1, which are mutually exclusive and exhaustive. If $\operatorname{Pr}(X=1) = p$, then $\operatorname{Pr}(X=0) = 1-p$ and the entropy of $X$ (in shannons) is given by $\operatorname{H}_\text{b}(p) = -p \log_2 p - (1-p) \log_2 (1-p)$, where $0 \log_2 0$ is taken to be 0. The logarithms in this formula are usually taken (as shown in the graph) to the base 2. See binary logarithm. When $p = \tfrac{1}{2}$, the binary entropy function attains its maximum value; this is the case of an unbiased coin flip. $\operatorname{H}_\text{b}(p)$ is distinguished from the entropy function $\mathrm{H}(X)$ in that the former takes a single real number as a parameter whereas the latter takes a distribution or random variable as a parameter. Sometimes the binary entropy function is also written as $\operatorname{H}_2(p)$. However, it is different from and should not be confused with the Rényi entropy, which is denoted as $\mathrm{H}_2(X)$.
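As a hedged illustration of the relationship described above, the following Python sketch (the helper entropy and the grid search are assumptions for illustration, not from the source) treats the binary entropy as the special case of the general entropy function applied to a two-point distribution, and checks numerically that the maximum sits at p = 1/2:

import math

def entropy(dist):
    """Shannon entropy H(X) of a discrete distribution, in shannons."""
    return -sum(p * math.log2(p) for p in dist if p > 0.0)

# H_b(p) is the special case of H(X) for the two-point distribution (p, 1 - p).
p = 0.3
print(entropy([p, 1.0 - p]))  # ~0.8813, the binary entropy at p = 0.3

# Over a grid of p values, the maximum is attained at p = 0.5 (unbiased coin),
# where the binary entropy equals exactly 1 shannon.
grid = [i / 1000 for i in range(1001)]
best = max(grid, key=lambda q: entropy([q, 1.0 - q]))
print(best, entropy([best, 1.0 - best]))  # 0.5 1.0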
Is primary topic of
Binary entropy function
Label
Binary entropy function
Link from a Wikipage to an external page
web.archive.org/web/20160217105359/http:/www.inference.phy.cam.ac.uk/mackay/itila/book.html
Link from a Wikipage to another Wikipage
Bernoulli process
Binary logarithm
Category:Entropy and information
David J. C. MacKay
Derivative
Entropy (information theory)
Fair coin
File:Binary entropy plot.svg
Information entropy
Information theory
Logit
Metric entropy
Parameter
Probability
Quantities of information
Random variable
Rényi entropy
Shannon (unit)
Taylor series
SameAs
4Yf9v
Entropia binarna
m.0dc1gm
Q4913893
二値エントロピー関数
Subject
Category:Entropy and information
Thumbnail
Binary entropy plot.svg?width=300
WasDerivedFrom
Binary entropy function?oldid=1071507954&ns=0
WikiPageInterLanguageLink
二元熵函數
WikiPageLength
4344
Wikipage page ID
5275277
Wikipage revision ID
1071507954
WikiPageUsesTemplate
Template:ISBN
Template:Reflist