Information theory is the mathematical theory concerned with the measurement and processing of information. Information theory was established by Claude Shannon and includes the source coding theorem, the channel coding theorem, and the channel capacity theorem^{[1]}.

## Entropy

We assume $p_i$ is the probability of the $i$-th event, where the set of all events is denoted as $M$.

**Self-information** under the above assumption is given by

$$I(p_i) = -\log p_i,$$

where $p_i$ is the probability of the $i$-th event and $\sum_{i \in M} p_i = 1$.
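As a minimal sketch, self-information can be computed directly from its definition; the function name `self_information` and the choice of base 2 (yielding bits) are illustrative assumptions, not part of the original text:

```python
import math

def self_information(p, base=2):
    """Self-information I(p) = -log(p); base 2 gives the result in bits."""
    return -math.log(p, base)

# A fair coin flip has two outcomes of probability 1/2,
# so each outcome carries 1 bit of self-information.
print(self_information(0.5))
```

Note that rarer events carry more self-information: an event with probability 1/8 carries 3 bits, while a certain event ($p = 1$) carries 0 bits.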

**Entropy** of the set of all events is given by

$$H = E[I] = -\sum_{i \in M} p_i \log p_i,$$

where $E[\cdot]$ is the expectation operator.

## References

1. C. E. Shannon, "A Mathematical Theory of Communication", *Bell System Technical Journal*, vol. 27, pp. 379–423, 623–656, July, October 1948.