
In information theory, information (or self-information) is a basic quantity that measures the amount of information conveyed by an event. For example, information can be measured in units of bits as follows:

$I(X) = \log_2 \left( \frac{1}{p(X)} \right)$

where $X$ is the event and $p(X)$ is the probability of event $X$. The less expected an event is, the higher its information content, while frequently occurring events have low values of $I(X)$.
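As a minimal sketch of this definition (the function name is chosen here for illustration), the formula can be computed directly with the base-2 logarithm:

```python
import math

def self_information(p: float) -> float:
    """Self-information I(X) = log2(1/p(X)), measured in bits."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return math.log2(1 / p)

# A fair coin flip (p = 0.5) carries 1 bit of information.
print(self_information(0.5))    # → 1.0
# A rarer event (p = 1/8) carries more information: 3 bits.
print(self_information(0.125))  # → 3.0
```

Note how the rarer event (probability 1/8) yields three times the information of the fair coin flip, matching the intuition that unexpected events are more informative.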