Information Science Equations

  1. The running time of an algorithm: $T(n) = O(f(n))$
  2. The space complexity of an algorithm: $S(n) = O(f(n))$
  3. The Shannon entropy: $H = -\sum_{i=1}^{n} p_i \log_2 p_i$
  4. The mutual information: $I(X;Y) = \sum_{y \in Y} \sum_{x \in X} p(x,y) \log_2 \frac{p(x,y)}{p(x)\,p(y)}$
  5. The Kullback-Leibler divergence: $D_{\mathrm{KL}}(P \| Q) = \sum_{i} P(i) \log_2 \frac{P(i)}{Q(i)}$
  6. Bayes' theorem: $P(A|B) = \frac{P(B|A)\,P(A)}{P(B)}$
  7. The probability of an event (for equally likely outcomes): $P(E) = \frac{n(E)}{n(S)}$
  8. The expected value: $E[X] = \sum_{i=1}^{n} x_i p_i$
  9. The variance: $\mathrm{Var}(X) = E[(X - \mu)^2]$
  10. The covariance: $\mathrm{Cov}(X,Y) = E[(X - \mu_X)(Y - \mu_Y)]$

where $T(n)$ is the running time of an algorithm, $S(n)$ is the space complexity, $H$ is the Shannon entropy, $I(X;Y)$ is the mutual information between random variables $X$ and $Y$, $D_{\mathrm{KL}}(P \| Q)$ is the Kullback-Leibler divergence of distribution $P$ from distribution $Q$, $P(A|B)$ is the conditional probability of event $A$ given event $B$, $P(E)$ is the probability of event $E$ in a sample space $S$ of equally likely outcomes, $E[X]$ is the expected value of a random variable $X$, $\mathrm{Var}(X)$ is the variance of $X$, and $\mathrm{Cov}(X,Y)$ is the covariance between random variables $X$ and $Y$. Short numerical sketches of formulas 3 through 10 follow.
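Formulas 1 and 2 are asymptotic bounds rather than computable quantities, so the sketches start with formula 3. First, a minimal NumPy sketch of the entropy, mutual information, and KL divergence for discrete distributions; the joint distribution `p_xy`, the example numbers, and the function names are all invented here for illustration:

```python
# Numerical sketch of formulas 3-5 (illustrative only).
import numpy as np

def entropy(p):
    """Shannon entropy H = -sum_i p_i log2 p_i, with 0 log 0 taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """D_KL(P || Q) = sum_i P(i) log2 (P(i)/Q(i)); assumes Q > 0 wherever P > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def mutual_information(p_xy):
    """I(X;Y) = sum_{x,y} p(x,y) log2 [ p(x,y) / (p(x) p(y)) ] from a joint table."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal distribution of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal distribution of Y
    mask = p_xy > 0                          # skip zero-probability cells
    return np.sum(p_xy[mask] * np.log2((p_xy / (p_x * p_y))[mask]))

# Invented 2x2 joint distribution over (X, Y)
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(entropy(p_xy.sum(axis=1)))   # H(X) = 1.0 bit
print(mutual_information(p_xy))    # I(X;Y), about 0.278 bits
print(kl_divergence(np.array([0.5, 0.5]), np.array([0.9, 0.1])))
```

Note that $I(X;Y) = D_{\mathrm{KL}}(p(x,y) \| p(x)\,p(y))$, so the mutual information measures how far the joint distribution is from what independence would predict.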
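And a sketch of formulas 6 through 10, again with invented numbers; the discrete variables $X$ and $Y$ are assumed to take paired values $(x_i, y_i)$ with probabilities $p_i$:

```python
# Numerical sketch of formulas 6-10 (illustrative only).
import numpy as np

# 6. Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
p_a, p_b_given_a, p_b = 0.01, 0.95, 0.05
print(p_b_given_a * p_a / p_b)        # P(A|B) = 0.19

# 7. Probability with equally likely outcomes: P(E) = n(E) / n(S),
#    e.g. rolling an even number on a fair die: n(E) = 3, n(S) = 6
print(3 / 6)                          # 0.5

# 8-10. Expected value, variance, covariance of discrete random variables
x = np.array([1.0, 2.0, 3.0])         # values of X
y = np.array([2.0, 4.0, 5.0])         # values of Y, paired with X
p = np.array([0.2, 0.5, 0.3])         # probability p_i of each pair (x_i, y_i)

e_x = np.sum(x * p)                   # E[X] = sum_i x_i p_i
e_y = np.sum(y * p)
var_x = np.sum(p * (x - e_x) ** 2)    # Var(X) = E[(X - mu)^2]
cov_xy = np.sum(p * (x - e_x) * (y - e_y))  # Cov(X,Y) = E[(X - mu_X)(Y - mu_Y)]
print(e_x, var_x, cov_xy)             # 2.1, 0.49, ...
```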