
Entropy is a metric that measures the amount of information, mathematically formulated as

$H(X) = E \left[ \log_2 \frac{1}{p(X)} \right] = \sum_{x} p(x) \log_2 \frac{1}{p(x)}$

where $X$ is the information source and $p(X)$ is its probability mass function. Note that the expectation is taken over $p$, so the probability weights $p(x)$ appear only once in the sum.
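As a minimal sketch of the formula above (the function name `entropy` is my own), the sum can be computed directly over a discrete distribution; a fair coin illustrates the classic result of exactly one bit:

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a discrete distribution.

    Computes sum over x of p(x) * log2(1/p(x)), skipping
    zero-probability outcomes, whose contribution is 0 in the limit.
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin is maximally uncertain between two outcomes: 1 bit.
print(entropy([0.5, 0.5]))  # → 1.0

# A certain outcome carries no information: 0 bits.
print(entropy([1.0]))  # → 0.0
```

A biased distribution falls strictly between these extremes, e.g. `entropy([0.25, 0.75])` is roughly 0.81 bits.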