Information theory is a mathematical theory concerned with the quantification and processing of information. It was established by Claude Shannon and includes the source coding theorem, the channel coding theorem, and the channel capacity theorem[1].

Entropy

Let M denote the set of all possible events, and let p(m) be the probability of the mth event.

Under this assumption, the self-information of the mth event is given by

I(m) = \log \left( \frac{1}{p(m)} \right) = -\log p(m)

where the probabilities satisfy \sum_{m \in M} p(m) = 1.
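
For illustration (this example is not part of the original article), the self-information can be computed directly from a probability. The base of the logarithm is not fixed above, so base 2 (bits) is an assumption here:

import math

def self_information(p, base=2):
    # Self-information I(m) = -log(p(m)); the logarithm base (here 2, i.e. bits)
    # is an assumption, since the article does not specify one.
    if not 0 < p <= 1:
        raise ValueError("p must be a probability in (0, 1]")
    return -math.log(p, base)

# An event with probability 1/8 carries 3 bits of self-information.
print(self_information(0.125))  # 3.0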

The entropy of the event set M is the expected self-information, given by

H(M) = E[I(M)] = \sum_{m \in M} p(m) I(m) = -\sum_{m \in M} p(m) \log p(m)

where E[\cdot] denotes the expectation operator.
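
As a companion sketch (again an addition, with the base-2 logarithm as an assumption), the entropy weights each event's self-information by its probability:

import math

def entropy(probabilities, base=2):
    # Entropy H(M) = -sum_m p(m) * log p(m) for a discrete distribution.
    # Assumes `probabilities` sums to 1; zero-probability events contribute 0
    # by the usual convention 0 * log 0 = 0. Base 2 (bits) is an assumption.
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# A fair coin has 1 bit of entropy; a fair six-sided die has log2(6) ≈ 2.585 bits.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([1/6] * 6))    # ≈ 2.585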

References

  1. C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July and October 1948.
