In information theory, information (or self-information) is the basic quantity used to measure the information content of an event. Measured in bits, it is defined as


I(X) = \log_2 \left( \frac{1}{p(X)} \right)

where X is the event and p(X) is the probability that X occurs. Unexpected or unlikely events carry a large amount of information, while frequently occurring events have low values of I(X).
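As a concrete illustration, the following sketch computes I(X) in bits from an event's probability; the function name self_information and the example probabilities are chosen here for illustration, not taken from the article.

```python
import math

def self_information(p: float) -> float:
    """Self-information, in bits, of an event with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return math.log2(1 / p)

# A fair coin flip (p = 0.5) carries exactly 1 bit of information,
# while a rare event (p = 0.01) carries about 6.64 bits.
print(self_information(0.5))   # 1.0
print(self_information(0.01))  # ~6.644
```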
