probability & statistics

descriptive statistics

median index is (n+1)/2
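for example, with n = 7 sorted values the median is the (7+1)/2 = 4th value; with even n the index falls between two values and the median is their average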

probability basics

AND and OR rules for a single discrete variable
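as a quick reference, the usual forms are:
P(A and B) = P(A) * P(B|A), which reduces to P(A) * P(B) when A and B are independent
P(A or B) = P(A) + P(B) - P(A and B), which reduces to P(A) + P(B) when A and B are mutually exclusive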

why we multiply in tree diagrams

P(A|B) = P(A) ⇒ P(B|A) = P(B)
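a one-line check (assuming P(A), P(B) > 0): if P(A|B) = P(A), then P(B|A) = P(A and B) / P(A) = P(A|B) * P(B) / P(A) = P(B), so independence is symmetric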

expected value and variance basics

E[X + Y] = E[X] + E[Y]

E[cX] = c * E[X] where c is a constant

E[XY] = E[X] * E[Y] where X and Y are independent random variables
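a small worked example combining these rules: if E[X] = 2 and E[Y] = 3, then E[X + Y] = 5 and E[4X] = 8 always, while E[XY] = 6 when X and Y are independent (otherwise a covariance term enters, since E[XY] = E[X] * E[Y] + Cov(X, Y))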

measuring the spread of data: mean absolute deviation and mean squared deviation
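the usual definitions relative to the mean x̄: mean absolute deviation = (1/n) * sum of |x_i - x̄|, mean squared deviation = (1/n) * sum of (x_i - x̄)^2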

Var[X] = E[X^2] - (E[X])^2
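this follows by expanding the definition: Var[X] = E[(X - E[X])^2] = E[X^2] - 2 * E[X] * E[X] + (E[X])^2 = E[X^2] - (E[X])^2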

Var[kX] = k^2 * Var[X]

Var[X + Y] = Var[X] + Var[Y] where X and Y are independent random variables
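a quick sanity check of both rules: if Var[X] = 1 and Var[Y] = 4 with X and Y independent, then Var[3X] = 9 and Var[X + Y] = 5; without independence an extra term 2 * Cov(X, Y) is added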

deriving E[X] and Var[X] for iid variables
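assuming this refers to n iid variables each with mean μ and variance σ^2: for the sum S = X_1 + ... + X_n, E[S] = n * μ and Var[S] = n * σ^2; for the sample mean S/n these become μ and σ^2 / n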

estimating the population variance using the sample mean and the sample size
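presumably this is the usual unbiased estimator (Bessel's correction): s^2 = (1/(n-1)) * sum of (x_i - x̄)^2, where x̄ is the sample mean and n is the sample size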

Markov's inequality
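statement for reference: if X >= 0, then P(X >= a) <= E[X] / a for any a > 0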

Chebyshev's inequality
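statement for reference: P(|X - E[X]| >= k * σ) <= 1 / k^2 for any k > 0, where σ^2 = Var[X]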

probability distribution stuff

expected value and variance of a binomial distribution
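for X ~ Binomial(n, p): E[X] = n * p and Var[X] = n * p * (1 - p)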

expected value and variance of a geometric distribution
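for X ~ Geometric(p) counting the number of trials up to and including the first success: E[X] = 1/p and Var[X] = (1 - p) / p^2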

expected value and variance of a geometric distribution (counting failures only, i.e., the success trial isn't included) (incomplete)
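for this variant, which counts only the failures before the first success: E[X] = (1 - p) / p and Var[X] = (1 - p) / p^2 (same variance, mean shifted down by 1)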

probability function of the Poisson distribution
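for X ~ Poisson(λ): P(X = k) = λ^k * e^(-λ) / k! for k = 0, 1, 2, ...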

if X is a positive continuous random variable with a memoryless property, then X is exponentially distributed
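the memoryless property here is P(X > s + t | X > s) = P(X > t) for all s, t >= 0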

expected value and variance of an exponential distribution
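for X ~ Exponential(λ) with density f(x) = λ * e^(-λx) for x >= 0: E[X] = 1/λ and Var[X] = 1/λ^2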

miscellaneous

deriving the regression coefficient of y on x
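the coefficient being derived is presumably the least-squares slope of y on x: b = Cov(x, y) / Var(x) = sum of (x_i - x̄)(y_i - ȳ) / sum of (x_i - x̄)^2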