# Mutual Information

Mutual information measures how dependent two random variables are; it is zero exactly when $X$ and $Y$ are independent.

$$
\begin{aligned}
I(X ; Y) & =\sum_{x, y} P(x, y) \log \frac{P(x, y)}{P(x) P(y)} \\
& =\mathrm{E}\left[\log \frac{1}{P(x)}+\log \frac{1}{P(y)}-\log \frac{1}{P(x, y)}\right] \\
& =H(X)+H(Y)-H(X, Y) .
\end{aligned}
$$

The second line expands the log ratio inside the expectation; taking the expectation of each term over the joint distribution $P(x, y)$ gives the entropies $H(X)$, $H(Y)$, and $H(X, Y)$.
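As a sanity check, here is a minimal numerical sketch that computes $I(X;Y)$ both from the definition and from the entropy identity and confirms they agree. The $2 \times 2$ joint table `P_xy` is an invented example, not from the text:

```python
import numpy as np

# Hypothetical joint distribution P(x, y); rows index x, columns index y.
P_xy = np.array([[0.30, 0.10],
                 [0.20, 0.40]])

P_x = P_xy.sum(axis=1)  # marginal P(x)
P_y = P_xy.sum(axis=0)  # marginal P(y)

# Direct definition: sum over (x, y) of P(x,y) * log(P(x,y) / (P(x) P(y))).
mi_direct = sum(
    P_xy[i, j] * np.log(P_xy[i, j] / (P_x[i] * P_y[j]))
    for i in range(P_xy.shape[0])
    for j in range(P_xy.shape[1])
    if P_xy[i, j] > 0  # skip zero-probability cells
)

def entropy(p):
    """Shannon entropy H = -sum p log p (natural log, nats)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Entropy identity: I(X;Y) = H(X) + H(Y) - H(X,Y).
mi_entropy = entropy(P_x) + entropy(P_y) - entropy(P_xy.flatten())

print(mi_direct, mi_entropy)
assert np.isclose(mi_direct, mi_entropy)
```

Both computations use natural logarithms, so the result is in nats; swapping in `np.log2` would report the same quantity in bits.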