Basic Concept of Probability Distributions 1: Binomial Distribution


PMF

If the random variable $X$ follows the binomial distribution with parameters $n$ and $p$, we write $X \sim B(n, p)$. The probability of getting exactly $x$ successes in $n$ trials is given by the probability mass function: $$f(x; n, p) = \Pr(X=x) = {n\choose x}p^{x}(1-p)^{n-x}$$ for $x=0, 1, 2, \cdots, n$, where ${n\choose x} = {n!\over(n-x)!x!}$.
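
As a quick numerical check (a minimal R sketch; the values $n=10$, $p=0.5$, $x=3$ are arbitrary), the built-in dbinom() agrees with the formula above:

n <- 10; p <- 0.5; x <- 3             # arbitrary example values
choose(n, x) * p^x * (1 - p)^(n - x)  # the PMF formula directly
# [1] 0.1171875
dbinom(x, n, p)                       # R's built-in binomial PMF
# [1] 0.1171875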

Proof:

$$ \begin{align*} \sum_{x=0}^{n}f(x; n, p) &= \sum_{x=0}^{n}{n\choose x}p^{x}(1-p)^{n-x}\\ &= [p + (1-p)]^{n}\;\;\quad\quad \mbox{(binomial theorem)}\\ &= 1 \end{align*} $$
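
Numerically, the masses indeed sum to one (same example values as above):

sum(dbinom(0:10, 10, 0.5))  # total probability over x = 0, 1, ..., 10
# [1] 1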

Mean

The expected value is $$\mu = E[X] = np$$

Proof:

$$ \begin{align*} E\left[X^k\right] &= \sum_{x=0}^{n}x^{k}{n\choose x}p^{x}(1-p)^{n-x}\\ &= \sum_{x=1}^{n}x^{k}{n\choose x}p^{x}(1-p)^{n-x}\\ &= np\sum_{x=1}^{n}x^{k-1}{n-1\choose x-1}p^{x-1}(1-p)^{n-x}\quad\quad\quad (\mbox{identity}\ x{n\choose x} = n{n-1\choose x-1})\\ &= np\sum_{y=0}^{n-1}(y+1)^{k-1}{n-1\choose y}p^{y}(1-p)^{n-1-y}\quad(\mbox{substituting}\ y=x-1)\\ &= npE\left[(Y + 1)^{k-1}\right] \quad\quad\quad \quad\quad\quad \quad\quad\quad\quad\quad (Y\sim B(n-1, p)) \\ \end{align*} $$ The identity used above follows from $$ \begin{align*} x{n\choose x} &= {x\cdot n!\over(n-x)!x!}\\ & = {n!\over(n-x)!(x-1)!}\\ &= n{(n-1)!\over[(n-1)-(x-1)]!(x-1)!}\\ &= n{n-1\choose x-1} \end{align*} $$ Hence, setting $k=1$, we have $$E[X] = np$$
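
The closed form is easy to confirm numerically (a small sketch; $n=10$, $p=0.3$ are arbitrary choices):

n <- 10; p <- 0.3             # arbitrary example values
sum(0:n * dbinom(0:n, n, p))  # E[X] computed from the definition
# [1] 3
n * p                         # the closed form np
# [1] 3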

Variance

The variance is $$\sigma^2 = \mbox{Var}(X) = np(1-p)$$

Proof:

$$ \begin{align*} \mbox{Var}(X) &= E\left[X^2\right] - E[X]^2\\ &= npE[Y+1] - n^2p^2 \quad\quad\quad\quad\;\; (\mbox{setting}\ k=2\ \mbox{above})\\ & = np\left(E[Y] + 1\right) - n^2p^2\\ & = np[(n-1)p + 1] - n^2p^2\quad\quad (Y\sim B(n-1, p))\\ &= np(1-p) \end{align*} $$
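
Again this can be confirmed numerically (same arbitrary example values as for the mean):

n <- 10; p <- 0.3; mu <- n * p
sum((0:n - mu)^2 * dbinom(0:n, n, p))  # Var(X) computed from the definition
# [1] 2.1
n * p * (1 - p)                        # the closed form np(1-p)
# [1] 2.1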

Examples

1. Let $X$ be binomially distributed with parameters $n=10$ and $p={1\over2}$. Determine the expected value $\mu$, the standard deviation $\sigma$, and the probability $P\left(|X-\mu| \geq 2\sigma\right)$. Compare with Chebyshev's Inequality.

Solution:

The binomial mass function is $$f(x) ={n\choose x} p^x \cdot q^{n-x},\ x=0, 1, 2, \cdots, n$$ where $q=1-p$. The expected value and the standard deviation are $$E[X] = np=5,\ \sigma = \sqrt{npq} = 1.581139$$ The probability that $X$ deviates from $\mu$ by at least two standard deviations is $$ \begin{align*} P\left(|X-\mu| \geq 2\sigma\right) &= P\left(|X-5| \geq 3.162278\right)\\ &= P(X\leq 1) + P(X \geq9)\\ &= \sum_{x=0}^{1}{10\choose x}p^{x}(1-p)^{10-x} + \sum_{x=9}^{10}{10\choose x}p^{x}(1-p)^{10-x}\\ & = 0.02148437 \end{align*} $$ R code:

sum(dbinom(c(0, 1), 10, 0.5)) + 1 - sum(dbinom(c(0:8), 10, 0.5))  # P(X <= 1) + P(X >= 9), summing the PMF
# [1] 0.02148437
pbinom(1, 10, 0.5) + 1 - pbinom(8, 10, 0.5)  # the same tails via the CDF
# [1] 0.02148438

Chebyshev's Inequality gives the much weaker bound $$P\left(|X - \mu| \geq 2\sigma\right) \leq {1\over2^2} = 0.25$$
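
As a sanity check, we can also estimate the tail probability by simulation and set it against the Chebyshev bound (a minimal sketch; the sample size and seed are arbitrary):

set.seed(1)                       # arbitrary seed, for reproducibility
x <- rbinom(1e6, 10, 0.5)         # 1,000,000 draws from B(10, 0.5)
mean(abs(x - 5) >= 2 * 1.581139)  # empirical tail probability
# roughly 0.0215, far below the Chebyshev bound of 0.25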

2. What is the probability $P_1$ of getting at least six heads when tossing a fair coin ten times?

Solution:

$$ \begin{align*} P(X \geq 6) &= \sum_{x=6}^{10}{10\choose x}0.5^{x}0.5^{10-x}\\ &= 0.3769531 \end{align*} $$ R code:

1 - pbinom(5, 10, 0.5)         # P(X >= 6) via the CDF
# [1] 0.3769531
sum(dbinom(c(6:10), 10, 0.5))  # same, summing the PMF
# [1] 0.3769531

3. What is the probability $P_2$ of getting at least 60 heads when tossing a fair coin 100 times?

Solution:

$$ \begin{align*} P(X \geq 60) &= \sum_{x=60}^{100}{100\choose x}0.5^{x}0.5^{100-x}\\ &= 0.02844397 \end{align*} $$ R code:

1 - pbinom(59, 100, 0.5)          # P(X >= 60) via the CDF
# [1] 0.02844397
sum(dbinom(c(60:100), 100, 0.5))  # same, summing the PMF
# [1] 0.02844397

Alternatively, we can use the normal approximation (a common rule of thumb is that it is adequate when $np > 5$ and $n(1-p) > 5$). Here $\mu = np=50$ and $\sigma = \sqrt{np(1-p)} = \sqrt{25} = 5$. $$ \begin{align*} P(X \geq 60) &= 1 - P(X \leq 59)\\ &\approx 1- \Phi\left({59.5-50\over 5}\right)\quad\quad (\mbox{continuity correction})\\ &= 1-\Phi(1.9)\\ &= 0.02871656 \end{align*} $$ R code:

1 - pnorm(1.9)  # upper tail of the standard normal
# [1] 0.02871656
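
To see how accurate the approximation is here, we can take the ratio of the approximate and exact tail probabilities computed above:

(1 - pnorm(1.9)) / (1 - pbinom(59, 100, 0.5))  # approximate / exact
# roughly 1.01, i.e. the approximation overshoots by about 1%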

4. What is the probability $P_3$ of getting at least 600 heads when tossing a fair coin 1000 times?

Solution: $$ \begin{align*} P(X \geq 600) &= \sum_{x=600}^{1000}{1000\choose x} 0.5^{x} 0.5^{1000-x}\\ &= 1.364232\times10^{-10} \end{align*} $$ R code:

pbinom(599, 1000, 0.5, lower.tail = FALSE)  # P(X >= 600), avoiding cancellation in 1 - pbinom()
# [1] 1.364232e-10
sum(dbinom(c(600:1000), 1000, 0.5))         # same, summing the PMF
# [1] 1.364232e-10

Alternatively, we can use the normal approximation. $\mu = np=500$ and $\sigma = \sqrt{np(1-p)} = \sqrt{250}$. $$ \begin{align*} P(X \geq 600) &= 1 - P(X \leq 599)\\ &\approx 1- \Phi\left({599.5-500\over \sqrt{250}}\right)\quad\quad (\mbox{continuity correction})\\ &= 1.557618 \times 10^{-10} \end{align*} $$ R code:

1 - pnorm(99.5/sqrt(250))  # upper tail of the standard normal
# [1] 1.557618e-10
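
Note that the relative error of the normal approximation grows in the extreme tail: here it overshoots the exact value by about $14\%$ ($1.557618/1.364232 \approx 1.14$), compared with roughly $1\%$ in the previous example.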
