Basic Concept of Probability Distributions 5: Hypergeometric Distribution

PMF

Suppose that a sample of size $n$ is to be chosen randomly (without replacement) from an urn containing $N$ balls, of which $m$ are white and $N-m$ are black. If we let $X$ denote the number of white balls selected, then $$f(x; N, m, n) = \Pr(X = x) = {{m\choose x}{N-m\choose n-x}\over {N\choose n}}$$ for $x= 0, 1, 2, \cdots, n$.
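
As a quick check, the PMF can be computed directly in R with `choose()`; R's built-in hypergeometric density `dhyper(x, m, N - m, n)` gives the same values. A minimal sketch, with illustrative values $N = 10$, $m = 4$, $n = 3$ chosen only for demonstration:

f = function(x, N, m, n) choose(m, x) * choose(N - m, n - x) / choose(N, n)
x = 0:3
round(f(x, 10, 4, 3), 3)        # PMF from the formula above
# [1] 0.167 0.500 0.300 0.033
round(dhyper(x, 4, 6, 3), 3)    # built-in: dhyper(x, m = white, n = black, k = draws)
# [1] 0.167 0.500 0.300 0.033
sum(f(x, 10, 4, 3))             # the probabilities sum to 1
# [1] 1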

Proof:

This is essentially Vandermonde's identity: $${m+n\choose r} = \sum_{k=0}^{r}{m\choose k}{n\choose r-k}$$ where $m$, $n$, $k$, $r\in \mathbb{N}_0$. Because $$ \begin{align*} \sum_{r=0}^{m+n}{m+n\choose r}x^r &= (1+x)^{m+n} \quad\quad\quad\quad\quad\quad\quad\quad \mbox{(binomial theorem)}\\ &= (1+x)^m(1+x)^n\\ &= \left(\sum_{i=0}^{m}{m\choose i}x^{i}\right)\left(\sum_{j=0}^{n}{n\choose j}x^{j}\right)\\ &= \sum_{r=0}^{m+n}\left(\sum_{k=0}^{r}{m\choose k}{n\choose r-k}\right)x^r \quad\quad\mbox{(product of two binomials)} \end{align*} $$ Using the product of two binomials: $$ \begin{align*} \left(\sum_{i=0}^{m}a_i x^i\right)\left(\sum_{j=0}^{n}b_j x^j\right) &= \left(a_0+a_1x+\cdots + a_mx^m\right)\left(b_0+b_1x+\cdots + b_nx^n\right)\\ &= a_0b_0 + (a_0b_1 +a_1b_0)x + (a_0b_2 + a_1b_1 + a_2b_0)x^2 +\cdots + a_mb_nx^{m+n}\\ &= \sum_{r=0}^{m+n}\left(\sum_{k=0}^{r}a_{k}b_{r-k}\right)x^{r} \end{align*} $$ Hence, equating the coefficients of $x^r$ on both sides, $$ \begin{align*} &\sum_{r=0}^{m+n}{m+n\choose r}x^r = \sum_{r=0}^{m+n}\left(\sum_{k=0}^{r}{m\choose k}{n\choose r-k}\right)x^r\\ \implies& {m+n\choose r} = \sum_{k=0}^{r}{m\choose k}{n\choose r-k}\\ \implies& \sum_{k=0}^{r}{{m\choose k}{n\choose r-k}\over {m+n\choose r}} = 1 \end{align*} $$ Applying the identity with $m$ white balls, $N-m$ black balls, and $r = n$ draws shows that $\sum_{x=0}^{n}f(x; N, m, n) = 1$, i.e. $f$ is a valid PMF.
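
A quick numerical check of the identity in R (illustrative values $m = 5$, $n = 7$, $r = 6$, assumed only for demonstration; note that `choose(a, b)` returns 0 when $b > a$):

choose(5 + 7, 6)                          # left-hand side
# [1] 924
sum(choose(5, 0:6) * choose(7, 6 - 0:6))  # right-hand side
# [1] 924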

Mean

The expected value is $$\mu = E[X] = {nm\over N}$$

Proof:

$$ \begin{align*} E[X^k] &= \sum_{x=0}^{n}x^kf(x; N, m, n)\\ &= \sum_{x=0}^{n}x^k{{m\choose x}{N-m\choose n-x}\over {N\choose n}}\\ &= {nm\over N}\sum_{x=1}^{n} x^{k-1} {{m-1 \choose x-1}{N-m\choose n-x}\over {N-1 \choose n-1}}\\ &\quad\quad (\mbox{identities:}\ x{m\choose x} = m{m-1\choose x-1},\ n{N\choose n} = N{N-1\choose n-1};\ \mbox{the}\ x=0\ \mbox{term vanishes})\\ &= {nm\over N}\sum_{y=0}^{n-1} (y+1)^{k-1} {{m-1 \choose y}{(N-1) - (m - 1)\choose (n-1)-y}\over {N-1 \choose n-1}}\quad\quad(\mbox{setting}\ y=x-1)\\ &= {nm\over N}E\left[(Y+1)^{k-1}\right] \quad\quad\quad \quad\quad \quad\quad\quad\quad (\mbox{since}\ Y\sim f(y; N-1, m-1, n-1)) \end{align*} $$ Hence, setting $k=1$, we have $$E[X] = {nm\over N}$$ Note that this matches the mean of the binomial distribution, $\mu = np$, with $p = {m\over N}$.
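
The closed form is easy to verify numerically in R; a sketch using the lotto parameters from the examples below ($N = 37$, $m = 19$, $n = 7$):

N = 37; m = 19; n = 7
x = 0:n
sum(x * dhyper(x, m, N - m, n))   # E[X] computed directly from the PMF
# [1] 3.594595
n * m / N                         # closed form nm/N
# [1] 3.594595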

Variance

The variance is $$\sigma^2 = \mbox{Var}(X) = np(1-p)\left(1 - {n-1 \over N-1}\right)$$ where $p = {m\over N}$.

Proof:

$$ \begin{align*} E[X^2] &= {nm\over N}E[Y+1] \quad\quad\quad \quad\quad\quad \quad (\mbox{setting}\ k=2)\\ &= {nm\over N}\left(E[Y] + 1\right)\\ & = {nm\over N}\left[{(n-1) (m-1) \over N-1}+1\right] \end{align*} $$ Hence the variance is $$ \begin{align*} \mbox{Var}(X) &= E\left[X^2\right] - E[X]^2\\ &= {mn\over N}\left[{(n-1) (m-1) \over N-1}+1 - {nm\over N}\right]\\ &= np \left[ (n-1) \cdot {pN-1\over N-1}+1-np\right] \quad\quad \quad \quad \quad\quad(\mbox{setting}\ p={m\over N})\\ &= np\left[(n-1)\cdot {p(N-1) + p -1 \over N-1} + 1 -np\right]\\ &= np\left[(n-1)p + (n-1)\cdot{p-1 \over N-1} + 1-np\right]\\ &= np\left[1-p - (1-p)\cdot {n-1\over N-1}\right] \\ &= np(1-p)\left(1 - {n-1 \over N-1}\right) \end{align*} $$ Note that the finite population correction factor $1 - {n-1\over N-1}$ approaches $1$ when $N$ is sufficiently large (i.e. ${n-1\over N-1}\rightarrow 0$ as $N\rightarrow +\infty$), in which case the variance reduces to that of the binomial distribution, $\sigma^2 = np(1-p)$, with $p = {m\over N}$.
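
The same numerical check for the variance, again with $N = 37$, $m = 19$, $n = 7$:

N = 37; m = 19; n = 7; p = m / N
x = 0:n
EX = sum(x * dhyper(x, m, N - m, n))
sum((x - EX)^2 * dhyper(x, m, N - m, n))    # Var(X) from the PMF
# [1] 1.457268
n * p * (1 - p) * (1 - (n - 1) / (N - 1))   # closed form
# [1] 1.457268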

Examples

1. In a lotto game, seven balls are drawn randomly (without replacement) from an urn containing 37 balls numbered from 0 to 36. Calculate the probability $P$ of drawing exactly $k$ even-numbered balls, for $k=0, 1, \cdots, 7$.

Solution:

Among the numbers $0, 1, \ldots, 36$ there are 19 even and 18 odd, so $N = 37$, $m = 19$, $n = 7$ and $$P(X = k) = {{19\choose k}{18\choose 7-k}\over {37 \choose 7}}$$

p = numeric(8); k = 0:7
for (i in k){
  p[i+1] = round(choose(19, i) * choose(18, 7-i)
                 / choose(37, 7), 3)
}
p
# [1] 0.003 0.034 0.142 0.288 0.307 0.173 0.047 0.005
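
Equivalently, the loop can be replaced by a single call to R's vectorized built-in `dhyper` (parameters: number of white balls, number of black balls, number of draws):

round(dhyper(k, 19, 18, 7), 3)
# [1] 0.003 0.034 0.142 0.288 0.307 0.173 0.047 0.005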

2. Determine the same probabilities as in the previous problem, this time using the normal approximation.

Solution:

The mean is $$\mu = {nm\over N} = {7\times19\over 37} = 3.594595$$ and the standard deviation is $$\sigma = \sqrt{{nm\over N}\left(1-{m\over N}\right)\left(1 - {n-1\over N-1}\right)} = \sqrt{{7\times19\over 37}\left(1 - {19\over 37}\right) \left(1 - {7-1\over 37-1}\right)} = 1.207174$$ Approximating $\Pr(X = k)$ by the normal density evaluated at $k$ gives

p = numeric(8); k = 0:7
mu = 7 * 19 / 37
s = sqrt(7 * 19 / 37 * (1 - 19/37) * (1 - 6/36))
for (i in k){
  p[i+1] = round(dnorm(i, mu, s), 3)
}
p
# [1] 0.004 0.033 0.138 0.293 0.312 0.168 0.045 0.006
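
Since $X$ is integer-valued, a common refinement (not used above) is a continuity correction: approximate $\Pr(X = k)$ by the normal probability of the interval $(k - 0.5,\ k + 0.5)$ rather than the density at $k$. A sketch reusing `mu` and `s` from above, which yields values close to the exact hypergeometric probabilities:

round(pnorm(k + 0.5, mu, s) - pnorm(k - 0.5, mu, s), 3)  # normal mass assigned to each k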

Reference

  1. Ross, S. (2010). A First Course in Probability (8th Edition). Chapter 4. Pearson. ISBN: 978-0-13-603313-4.
  2. Brink, D. (2010). Essentials of Statistics: Exercises. Chapter 11. ISBN: 978-87-7681-409-0.