- The running time of an algorithm: $T(n) = O(f(n))$
- The space complexity of an algorithm: $S(n) = O(f(n))$
- The Shannon entropy (see the first sketch below): $H = -\sum_{i=1}^{n} p_i \log_2 p_i$
- The mutual information: $I(X;Y) = \sum_{y \in Y} \sum_{x \in X} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}$
- The Kullback-Leibler divergence: $D_{\mathrm{KL}}(P \| Q) = \sum_{i} P(i) \log_2 \frac{P(i)}{Q(i)}$
- Bayes' theorem: $P(A|B) = \frac{P(B|A)\,P(A)}{P(B)}$
- The probability of an event: $P(E) = \frac{n(E)}{n(S)}$
- The expected value (see the second sketch below): $E[X] = \sum_{i=1}^{n} x_i p_i$
- The variance: $\mathrm{Var}(X) = E[(X - \mu)^2]$
- The covariance: $\mathrm{Cov}(X,Y) = E[(X - \mu_X)(Y - \mu_Y)]$
where $\mu = E[X]$ denotes the mean, and $\mu_X = E[X]$ and $\mu_Y = E[Y]$ are the means of $X$ and $Y$, respectively.
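
As a concrete check on the information-theoretic formulas above, here is a minimal Python sketch that evaluates them for small discrete distributions, using base-2 logarithms as in the definitions. The function names (`entropy`, `kl_divergence`, `mutual_information`) and the dict-based joint-distribution representation are illustrative choices, not from any particular library:

```python
import math

def entropy(p):
    # H = -sum_i p_i log2 p_i; terms with p_i = 0 contribute nothing.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    # D_KL(P || Q) = sum_i P(i) log2(P(i)/Q(i)); requires Q(i) > 0 wherever P(i) > 0.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    # I(X;Y) from a joint distribution given as {(x, y): p(x, y)}.
    # The marginals p(x) and p(y) are obtained by summing the joint
    # over the other variable.
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

print(entropy([0.5, 0.5]))                             # fair coin: 1.0 bit
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # fully dependent: 1.0 bit
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))           # approx. 0.737 bits
```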
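
Similarly, the probabilistic quantities can be evaluated directly from a discrete distribution. The sketch below, again with hypothetical helper names and the same dict-based joint representation, computes the expected value, variance, and covariance, and applies Bayes' theorem to given probabilities:

```python
def expected_value(xs, ps):
    # E[X] = sum_i x_i p_i for a discrete random variable.
    return sum(x * p for x, p in zip(xs, ps))

def variance(xs, ps):
    # Var(X) = E[(X - mu)^2], with mu = E[X].
    mu = expected_value(xs, ps)
    return sum(p * (x - mu) ** 2 for x, p in zip(xs, ps))

def covariance(joint):
    # Cov(X, Y) = E[(X - mu_X)(Y - mu_Y)] from a joint distribution {(x, y): p(x, y)}.
    mu_x = sum(x * p for (x, _), p in joint.items())
    mu_y = sum(y * p for (_, y), p in joint.items())
    return sum(p * (x - mu_x) * (y - mu_y) for (x, y), p in joint.items())

def bayes(p_b_given_a, p_a, p_b):
    # P(A|B) = P(B|A) P(A) / P(B).
    return p_b_given_a * p_a / p_b

# Fair six-sided die: E[X] = 3.5, Var(X) = 35/12 ≈ 2.917.
xs, ps = [1, 2, 3, 4, 5, 6], [1 / 6] * 6
print(expected_value(xs, ps), variance(xs, ps))

# Perfectly correlated pair: Cov equals the variance of each marginal, 0.25.
print(covariance({(0, 0): 0.5, (1, 1): 0.5}))
print(bayes(p_b_given_a=0.9, p_a=0.5, p_b=0.6))  # P(A|B) = 0.75
```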