Information theory is a mathematical theory that concerns the measurement and processing of information. It was established by Claude Shannon and includes the source coding theorem, the channel coding theorem, and the channel capacity theorem.
We assume $p_i$ is the probability of the $i$-th event, where the total number of events is denoted by $M$.
Under this assumption, the self-information of the $i$-th event is given by
$$I_i = -\log_2 p_i,$$
where $p_i$ is the probability of the $i$-th event and $\sum_{i=1}^{M} p_i = 1$.
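As a brief illustration, the self-information formula can be evaluated directly; a minimal sketch in Python (the function name is chosen here for illustration):

```python
import math

def self_information(p):
    """Self-information I = -log2(p), in bits, of an event with probability p."""
    return -math.log2(p)

# A fair coin flip (p = 0.5) carries exactly 1 bit of information;
# rarer events carry more.
print(self_information(0.5))   # -> 1.0
print(self_information(0.125)) # -> 3.0
```

Note that self-information grows as the event becomes less probable: an event with probability $1/8$ carries 3 bits, while a certain event ($p = 1$) carries 0 bits.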
The entropy of the set of all events is the expected self-information, given by
$$H = E[I_i] = -\sum_{i=1}^{M} p_i \log_2 p_i,$$
where $E[\cdot]$ is the expectation operator.
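The entropy sum above can likewise be sketched in a few lines of Python (the function name is an illustrative choice, not from the source):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 are skipped, following the convention 0*log(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over M = 4 events maximizes entropy: H = log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # -> 2.0

# A skewed distribution over the same 4 events has lower entropy.
print(entropy([0.7, 0.1, 0.1, 0.1]))
```

The uniform case shows the well-known maximum $H = \log_2 M$, while any non-uniform distribution over the same events yields a smaller entropy.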