Probability & Statistics

Last updated: 28 Oct 2025

Basics

Set Theory

  1. $A \subseteq B$ means $A$ is a subset of $B$
  2. $A = B \iff A \subseteq B$ and $B \subseteq A$
  3. Empty set ($\emptyset$): the set with no elements; $\emptyset$ is a subset of every set

Operations with sets

$A\cup B = \{ x : x \in A\ \text{or}\ x \in B \}$
$A\cap B = \{ x : x \in A\ \text{and}\ x \in B \}$
$A^c = \{ x : x\notin A \}$

Let $\Gamma$ be an indexing set and let $\{A_\alpha,\ \alpha \in \Gamma\}$ be a collection of sets indexed by $\Gamma$.

Then $\bigcup\limits_{\alpha\in\Gamma} A_\alpha = \{ x : x \in A_\alpha\ \text{for some}\ \alpha \in \Gamma \}$

and $\bigcap\limits_{\alpha\in\Gamma} A_\alpha = \{ x : x \in A_\alpha\ \text{for every}\ \alpha \in \Gamma \}$

  1. A and B are disjoint (mutually exclusive) if $A \cap B = \emptyset$

  2. $A_1, A_2, \ldots$ are pairwise disjoint if $A_i \cap A_j = \emptyset \qquad \forall\ i \neq j$

  3. $A_1, A_2, \ldots$ is a partition of $S$ if:

    • $A_1, A_2,...$ are pairwise disjoint

    • $\bigcup\limits_{i} A_i = S$
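The set operations and the partition conditions above can be checked directly with Python's built-in sets (the universal set $S$ and the subsets here are illustrative choices, not from the notes):

```python
# Demonstration of union, intersection, complement, and a partition check.
S = set(range(10))          # universal set S = {0, 1, ..., 9}
A = {0, 2, 4, 6, 8}
B = {0, 1, 2, 3, 4}

union = A | B               # A ∪ B
intersection = A & B        # A ∩ B
complement = S - A          # A^c relative to S

# A partition of S: pairwise disjoint sets whose union is S
parts = [{0, 1, 2}, {3, 4, 5}, {6, 7, 8, 9}]
pairwise_disjoint = all(p & q == set()
                        for i, p in enumerate(parts)
                        for q in parts[i + 1:])
covers_S = set().union(*parts) == S
print(union, intersection, complement, pairwise_disjoint and covers_S)
```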

Sigma Algebra

Probability Function

Conditional Probability & Independence

Bayes' Theorem

Counting

|           | Without Replacement               | With Replacement |
| --------- | --------------------------------- | ---------------- |
| Ordered   | ${}^nP_k = \frac{n!}{(n-k)!}$     | $n^k$            |
| Unordered | ${}^nC_k = \frac{n!}{(n-k)!\ k!}$ | ${}^{n+k-1}C_k$  |
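All four counting formulas are available in the standard library; this sketch evaluates the table for an illustrative choice of $n$ and $k$:

```python
import math

# Counting k draws from n items, one value per cell of the table above.
n, k = 5, 3

ordered_without = math.perm(n, k)         # nPk = n! / (n-k)!
ordered_with = n ** k                     # n^k
unordered_without = math.comb(n, k)       # nCk = n! / ((n-k)! k!)
unordered_with = math.comb(n + k - 1, k)  # (n+k-1)Ck ("stars and bars")

print(ordered_without, ordered_with, unordered_without, unordered_with)
# 60 125 10 35
```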

Random Variables

For example, let X be the number of heads in 4 tosses of a fair coin:

| $x\in \chi$ | 0    | 1    | 2    | 3    | 4    |
| ----------- | ---- | ---- | ---- | ---- | ---- |
| $P(X=x)$    | 1/16 | 4/16 | 6/16 | 4/16 | 1/16 |
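The values in the table match $P(X=x) = \binom{4}{x}/2^4$, i.e. the number of heads in 4 fair coin tosses (an interpretation consistent with the listed probabilities):

```python
from fractions import Fraction
from math import comb

# Reproduce the pmf table: P(X = x) = C(4, x) / 2^4 for x = 0..4.
pmf = {x: Fraction(comb(4, x), 2 ** 4) for x in range(5)}
print(pmf)  # {0: 1/16, 1: 4/16, 2: 6/16, 3: 4/16, 4: 1/16} (reduced)
```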

Probability Distribution of Random Variables

| X is Discrete | X is Continuous |
| --- | --- |
| $F_X(x)$ is a step function | $F_X(x)$ is a continuous function |
| Probability mass function (pmf): $p_X(x) = P(X=x),\ \forall x$ | Probability density function (pdf) $f_X$ satisfies $F_X(x) = \int\limits_{-\infty}^{x} f_X(t)\,dt,\ \forall x$ |
| $\chi$ is a discrete subset of $\mathbb R$ | $\chi$ is a union of intervals in $\mathbb R$ |
| CDF: $F_X(x) = P(X\le x) = \sum\limits_{y\in \chi,\ y\le x} p_X(y)$ | CDF: $F_X(x) = P(X\le x) = \int\limits_{-\infty}^{x} f_X(t)\,dt$ |
| $P(a\le X \le b) = F(b) - F(a^-)$, where $a^-$ is the largest value in the support of X strictly less than $a$ | $P(a \le X \le b) = \int\limits_a^b f_X(x)\,dx = F_X(b) - F_X(a)$ |
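For the discrete case, the CDF and the interval identity $P(a\le X\le b) = F(b) - F(a^-)$ can be computed directly (using the 4-coin-toss pmf as the illustrative example):

```python
from fractions import Fraction

# Discrete CDF as a sum of pmf values over support points y <= x.
pmf = {0: Fraction(1, 16), 1: Fraction(4, 16), 2: Fraction(6, 16),
       3: Fraction(4, 16), 4: Fraction(1, 16)}

def cdf(x):
    """F_X(x) = sum of p_X(y) over support points y <= x."""
    return sum(p for y, p in pmf.items() if y <= x)

# P(1 <= X <= 3) = F(3) - F(1^-) = F(3) - F(0),
# since 0 is the largest support point strictly below 1.
prob = cdf(3) - cdf(0)
print(prob)  # 7/8
```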

NOTE :

Expectation of Random Variables

Variance of Random Variable

Properties of Expectation and Variance

Continuous Random Variable

Discrete Random Variables

  1. Uniform Discrete RV

    • $p_X(x) = P(X=x) = \frac{1}{N}, \qquad x = 1, 2, \ldots, N$

    • E(X) = $\frac{N+1}{2}$

    • V(X) = $\frac{N^2-1}{12}$
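The closed forms for E(X) and V(X) can be verified by summing over the support directly (N here is an arbitrary illustrative value):

```python
from fractions import Fraction

# Verify E(X) = (N+1)/2 and V(X) = (N^2 - 1)/12
# for the discrete uniform distribution on {1, ..., N}.
N = 10
p = Fraction(1, N)
mean = sum(x * p for x in range(1, N + 1))
var = sum((x - mean) ** 2 * p for x in range(1, N + 1))

print(mean, var)  # 11/2 and 33/4, matching (N+1)/2 and (N^2-1)/12
```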

Calculation of $p_X(x) = P(X=x)$ depends on whether sampling is done with or without replacement

  1. Poisson Distribution

    • $p_X(x) = \frac{e^{-\lambda} \cdot \lambda^x}{x!}, \quad x = 0, 1, 2, \ldots \qquad \text{where}\ e^\lambda = 1+\lambda + \frac{\lambda^2}{2!} + \frac{\lambda^3}{3!} + \ldots$

    • $p_X(x) \ge 0$ and $\sum\limits_{x=0}^{\infty} p_X(x) = e^{-\lambda}\cdot e^{\lambda} = 1$, so this is a legitimate pmf

    • $E(X) = \lambda$

    • $V(X) = \lambda$
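A quick numerical check of $E(X) = V(X) = \lambda$; truncating the infinite sum at a large cutoff is an approximation, and $\lambda$ here is an arbitrary illustrative value:

```python
import math

# Numerically verify E(X) = V(X) = lambda for the Poisson pmf.
lam = 3.0
pmf = [math.exp(-lam) * lam ** x / math.factorial(x) for x in range(100)]

total = sum(pmf)
mean = sum(x * p for x, p in enumerate(pmf))
var = sum((x - mean) ** 2 * p for x, p in enumerate(pmf))

print(round(total, 6), round(mean, 6), round(var, 6))
```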

  2. Negative Binomial Distribution

    • Perform repeated independent trials with P(success) = p until r successes are observed; X counts the number of failures x

    • $p_X(x) = \binom{x+r-1}{x} \cdot p^r \cdot (1-p)^x = \binom{n-1}{r-1} \cdot p^{r} \cdot (1-p)^{n-r}$ in terms of the total number of trials $n = x + r$

    • E(X) = $\frac{r(1-p)}{p}$

    • V(X) = $\frac{r(1-p)}{p^2}$
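The pmf, mean, and variance formulas can be checked numerically (r and p are illustrative values; the infinite sum is truncated, which is a close approximation here):

```python
from math import comb

# Check that the negative-binomial pmf (x failures before the r-th success)
# sums to 1 and matches E(X) = r(1-p)/p and V(X) = r(1-p)/p^2.
r, p = 3, 0.4
pmf = [comb(x + r - 1, x) * p ** r * (1 - p) ** x for x in range(500)]

total = sum(pmf)
mean = sum(x * q for x, q in enumerate(pmf))
var = sum((x - mean) ** 2 * q for x, q in enumerate(pmf))

print(round(total, 6), round(mean, 6), round(var, 6))
# ≈ 1.0, r(1-p)/p = 4.5, r(1-p)/p^2 = 11.25
```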

  3. Geometric Distribution

    • Repeatedly perform independent trials with P(success) = p until the first success; X is the number of trials needed.

    • It is the negative binomial with r = 1, restated in terms of the total number of trials (X = failures + 1), so the support starts at x = 1.

    • $p_X(x) = p(1-p)^{x-1}, \quad x = 1, 2, \ldots$

    • $E(X) = \frac{1}{p}$

    • $V(X) = \frac{(1-p)}{p^2}$
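As with the other distributions, E(X) and V(X) can be confirmed by direct summation (p is an illustrative value; the support is truncated for the sum):

```python
# Check E(X) = 1/p and V(X) = (1-p)/p^2 for the geometric pmf
# p_X(x) = p(1-p)^(x-1), x = 1, 2, ...
p = 0.25
pmf = {x: p * (1 - p) ** (x - 1) for x in range(1, 500)}

mean = sum(x * q for x, q in pmf.items())
var = sum((x - mean) ** 2 * q for x, q in pmf.items())

print(round(mean, 6), round(var, 6))  # ≈ 1/p = 4.0, (1-p)/p^2 = 12.0
```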

    • NOTE :

    • If $N, M \rightarrow \infty$ such that $\frac{M}{N} \rightarrow p$, then $hypergeometric(x; N, M, n) \rightarrow binomial(x; n, p)$

    • If $n \rightarrow \infty$ and $p \rightarrow 0$ such that $np \rightarrow \lambda$, then $binomial(x; n, p) \rightarrow poisson(x; \lambda)$
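The binomial-to-Poisson limit is easy to see numerically: hold $np = \lambda$ fixed and let $n$ grow (the values of $\lambda$, $x$, and $n$ are illustrative):

```python
from math import comb, exp, factorial

# Illustrate binomial(x; n, p) -> poisson(x; lambda) as n grows with np = lambda.
lam, x = 2.0, 3
poisson = exp(-lam) * lam ** x / factorial(x)

errors = []
for n in (10, 100, 1000):
    p = lam / n
    binom = comb(n, x) * p ** x * (1 - p) ** (n - x)
    errors.append(abs(binom - poisson))
    print(n, round(binom, 6), round(poisson, 6))
```

The printed binomial probability approaches the fixed Poisson value as n increases.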

Uniform Distribution - Continuous Random Variable

Normal Distribution - Continuous Random Variable

Gamma Distribution - Continuous Random Variable

Log Normal Distribution - Continuous Random Variable

Beta Distribution

Cauchy Distribution