Discrete Probability Distribution

Basic Concept

$x$: random variable, where $x \in \mathbb{R}$ or $x \in \mathbb{Z}$.

$P(x=k)=a$: for the event $x=k$, its probability is $a$. Note that the probability of any event lies between 0 and 1 inclusive.

For example, let $x$ be the outcome of a coin toss, with tail = 0 and head = 1:
$$P(x=0)=\frac{1}{2}, \quad P(x=1)=\frac{1}{2}$$

Note: for a discrete probability distribution, the sum of the probabilities of all events must be 1, i.e.
$$\sum_{i=-\infty}^{\infty} P(x=i) = 1$$

This is often used as a way of checking whether a probability distribution is valid.
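
As a quick illustration, here is a minimal Python sketch of this validity check applied to the coin example above (the `pmf` dictionary is my own notation, not part of the original notes):

```python
# Sketch: validating a candidate discrete distribution.
# Every probability must lie in [0, 1] and the total must be 1.
pmf = {0: 0.5, 1: 0.5}  # tail: 0, head: 1

assert all(0.0 <= p <= 1.0 for p in pmf.values())
assert abs(sum(pmf.values()) - 1.0) < 1e-12, "probabilities must sum to 1"
print("valid distribution")
```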

Expectation: $\mathbb{E}[x]=\sum\limits_{i}i \cdot P(x=i)$

Variance: $Var(x) = \mathbb{E}[x^2]-\mathbb{E}[x]^2$. Note that, in general,
$$\mathbb{E}[f(x)]=\sum\limits_{i}f(i)P(x=i)$$
so $\mathbb{E}[x^2] = \sum\limits_{i}i^2P(x=i)$.
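
Continuing the sketch above, the same coin pmf gives $\mathbb{E}[x] = 0.5$ and $Var(x) = 0.25$:

```python
# Sketch: E[x] and Var(x) from the definitions above,
# using E[f(x)] = sum_i f(i) * P(x = i).
pmf = {0: 0.5, 1: 0.5}

E_x  = sum(i * p for i, p in pmf.items())     # E[x]   = 0.5
E_x2 = sum(i**2 * p for i, p in pmf.items())  # E[x^2] = 0.5
var_x = E_x2 - E_x**2                         # Var(x) = 0.25
print(E_x, var_x)
```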

Mode/Modal value: the value $k$ for which $P(x=k)$ is greatest.

Algebra of expectation and variance ($a$, $b$, and $c$ are constants):

$$\mathbb{E}[aX+bY+c]=a\mathbb{E}[X]+b\mathbb{E}[Y]+c$$
$$Var(aX+b)=a^2Var(X)$$

When $X$ and $Y$ are independent ($a$, $b$, and $c$ are constants):

$$\mathbb{E}[XY]=\mathbb{E}[X] \times \mathbb{E}[Y]$$
$$Var(aX+bY+c)=a^2Var(X)+b^2Var(Y)$$
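
These identities can be sanity-checked numerically. Below is a small Monte Carlo sketch (my own illustration; the two independent fair dice are not part of the original notes):

```python
# Sketch: checking E[XY] = E[X]E[Y] and
# Var(aX + bY + c) = a^2 Var(X) + b^2 Var(Y) for independent X, Y.
import random

random.seed(0)
N = 200_000
X = [random.randint(1, 6) for _ in range(N)]  # fair die
Y = [random.randint(1, 6) for _ in range(N)]  # independent fair die

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    return mean([x * x for x in xs]) - mean(xs) ** 2

a, b, c = 2, -3, 5
Z = [a * x + b * y + c for x, y in zip(X, Y)]

print(mean([x * y for x, y in zip(X, Y)]), mean(X) * mean(Y))  # both ~ 12.25
print(var(Z), a**2 * var(X) + b**2 * var(Y))                   # both ~ 37.9
```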

Binomial Distribution

Symbol: $x \sim B(n,p)$, where $n \in \mathbb{N}$ and $p \in (0,1)$; $\sim$ reads “is distributed as”.

Definition: $P(x=r)= {n\choose r} p^r (1-p)^{n-r}$

Expectation: $\mathbb{E}[x]=np$. Proof:
$$\begin{equation}
\begin{aligned}
\mathbb{E}[x]&= \sum_{r=0}^{n} r \cdot P(x=r) \\
&= \sum_{r=0}^{n} r \cdot \frac{n!}{r!(n-r)!} p^{r}(1-p)^{n-r} \\
&= np\sum_{r=1}^{n} \frac{(n-1)!}{(r-1)!(n-r)!}p^{r-1}(1-p)^{n-r} \\
&= np \cdot (1-p+p)^{n-1} \quad \text{(binomial theorem)} \\
&=np
\end{aligned}
\end{equation}
$$

Variance: $Var(x)=np(1-p)$
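
A quick numerical check of the pmf, expectation, and variance (a sketch; the parameters $n = 10$, $p = 0.3$ are an arbitrary choice of mine):

```python
# Sketch: binomial pmf via math.comb, checking that it sums to 1
# and that E[x] = np, Var(x) = np(1 - p).
import math

n, p = 10, 0.3
pmf = {r: math.comb(n, r) * p**r * (1 - p)**(n - r) for r in range(n + 1)}

E = sum(r * q for r, q in pmf.items())
Var = sum(r**2 * q for r, q in pmf.items()) - E**2

print(sum(pmf.values()))     # 1.0 (up to rounding)
print(E, n * p)              # both 3.0
print(Var, n * p * (1 - p))  # both 2.1
```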

Poisson Distribution

Symbol: $x \sim Po(\lambda)$, modelling the number of occurrences of an event in a given “interval”, where $x \in \{0, 1, 2, \dots\}$.

Definition: $P(x=r)= e^{-\lambda} \cdot \frac{\lambda^r}{r!}$. Verify:

$$\begin{equation}
\begin{aligned}
\sum_{r=0}^{\infty} P(x=r)&= \sum_{r=0}^{\infty} e^{-\lambda} \frac{\lambda^r}{r!} \\
&= e^{-\lambda} \sum_{r=0}^{\infty}\frac{\lambda^r}{r!} \\
&= e^{-\lambda} \cdot e^{\lambda} \\
&= 1
\end{aligned}
\end{equation}
$$

Expectation: $\mathbb{E}[x]=\lambda$

Variance: $Var(x)=\lambda$
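
The same kind of check for the Poisson pmf (a sketch; truncating the infinite sum at $r = 100$, where the tail is negligible for $\lambda = 4$):

```python
# Sketch: Poisson pmf, checking sum ~ 1 and E[x] ~ Var(x) ~ lambda.
import math

lam = 4.0
pmf = [math.exp(-lam) * lam**r / math.factorial(r) for r in range(101)]

E = sum(r * p for r, p in enumerate(pmf))
Var = sum(r**2 * p for r, p in enumerate(pmf)) - E**2

print(sum(pmf))  # ~ 1.0
print(E, Var)    # both ~ 4.0 (= lambda)
```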

Notes:

  • If the number of occurrences in an interval of length $T$ follows a Poisson distribution with parameter $\lambda$, then the number of occurrences in an interval of length $kT$ follows a Poisson distribution with mean $k\lambda$.
  • If $X$, $Y$ are two independent Poisson random variables with means $\lambda$, $\mu$, then $X+Y$ has a Poisson distribution with mean $\lambda+\mu$, i.e.
    $$P(X+Y=r)=e^{-(\lambda+\mu)} \frac{(\lambda+\mu)^r}{r!}$$
  • If $n$ is “large” and $p$ is “small”, then a Poisson distribution with mean $np$ can be used to approximate the binomial distribution $x \sim B(n, p)$, as shown in the sketch below.
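
To illustrate the last point, here is a sketch comparing $B(n, p)$ with its $Po(np)$ approximation; the parameters $n = 1000$, $p = 0.002$ are my own choice of a “large $n$, small $p$” case:

```python
# Sketch: B(n, p) vs. the Poisson(np) approximation.
import math

n, p = 1000, 0.002
lam = n * p  # = 2

for r in range(6):
    binom = math.comb(n, r) * p**r * (1 - p)**(n - r)
    pois = math.exp(-lam) * lam**r / math.factorial(r)
    print(r, round(binom, 5), round(pois, 5))  # the two columns agree closely
```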

Geometric Distribution

Symbol: $x \sim Geo(p)$, where $p$ stands for the probability of success on each trial.

Definition: $P(x=k)=p \cdot (1-p)^{k-1}$ for $k = 1, 2, \dots$ (the trial on which the first success occurs). Verify:

$$\begin{equation}
\begin{aligned}
\sum_{k=1}^{\infty} P(x=k)&= \sum_{k=1}^{\infty} p \cdot (1-p)^{k-1} \\
&= p \cdot \frac{1}{1-(1-p)}\\
&= 1
\end{aligned}
\end{equation}
$$

Expectation: $\mathbb{E}[x]=\frac{1}{p}$. Proof (writing $q = 1-p$):

$$\begin{equation}
\begin{aligned}
\mathbb{E}[x]&= \sum_{k=1}^{\infty} kp(1-p)^{k-1} \\
&= p \cdot \sum_{k=1}^{\infty}kq^{k-1}\\
&= p \cdot \sum_{k=1}^{\infty} \frac{\mathrm{d}}{\mathrm{d}q}(q^k) \\
&= p \cdot \frac{\mathrm{d}}{\mathrm{d}q} \left(\sum_{k=0}^{\infty} q^k\right) \quad \text{(the extra constant term has zero derivative)} \\
&= p \cdot \frac{\mathrm{d}}{\mathrm{d}q} \left(\frac{1}{1-q}\right) \\
&= p \cdot \frac{1}{(1-q)^2} = p \cdot \frac{1}{p^2} \\
&= \frac{1}{p}
\end{aligned}
\end{equation}
$$

Variance: $Var(x)=\frac{1-p}{p^2}$
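
As with the other distributions, a truncated-sum sketch confirms the formulas (with the arbitrary choice $p = 0.25$):

```python
# Sketch: geometric pmf P(x = k) = p(1-p)^(k-1), checking
# E[x] = 1/p and Var(x) = (1-p)/p^2; truncation at k = 500 is ample.
p = 0.25
pmf = {k: p * (1 - p)**(k - 1) for k in range(1, 501)}

E = sum(k * q for k, q in pmf.items())
Var = sum(k**2 * q for k, q in pmf.items()) - E**2

print(sum(pmf.values()))    # ~ 1.0
print(E, 1 / p)             # both ~ 4.0
print(Var, (1 - p) / p**2)  # both ~ 12.0
```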
