Families of Continuous Random Variables

Work in progress...


INDEX



Uniform Random Variable

Exponential Random Variable

Erlang Random Variable

Gaussian Random Variable

Bivariate Gaussian Random Variable

Gaussian Random Vector

Beta Random Variable




Uniform Random Variable


\(X\) is a \(Uniform(a, b)\) random variable if the PDF of \(X\) is

$$f_{X}(x) = \begin{cases} 1/(b-a) & \quad a\leq x \leq b\\ 0 & \quad \text{otherwise}\\ \end{cases} $$

where the parameters satisfy \(b > a\).

$$F_{X}(x) = \begin{cases} 0 & \quad x \leq a\\ (x - a)/(b - a) & \quad a < x \leq b\\ 1 & \quad x > b\\ \end{cases} $$

$$\text{Expected value} \quad E[X] = (b + a)/2.$$

$$\text{Variance} \quad Var[X] = (b - a)^2/12.$$


PDF of Uniform Distribution (WIKIPEDIA)

CDF of Uniform Distribution (WIKIPEDIA)
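
As a quick numerical sanity check (a sketch, not part of the original text; it assumes NumPy is available and the parameter values are illustrative), sample moments of \(Uniform(a, b)\) draws should match the closed forms above:

```python
# Hedged sanity check: sample moments vs. the closed-form moments above.
import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, 5.0                      # illustrative parameters, b > a
x = rng.uniform(a, b, size=1_000_000)

print(x.mean(), (b + a) / 2)         # sample mean vs E[X] = (b + a)/2
print(x.var(), (b - a)**2 / 12)      # sample variance vs Var[X] = (b - a)^2/12
```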



Exponential Random Variable


\(X\) is an \(Exponential(\lambda)\) random variable if the PDF of \(X\) is

$$f_{X}(x) = \begin{cases} \lambda e^{-\lambda x} & \quad x \geq 0\\ 0 & \quad \text{otherwise.}\\ \end{cases} $$

where the parameter \(\lambda > 0\).

$$F_{X}(x) = \begin{cases} 1-e^{-\lambda x} & \quad x \geq 0\\ 0 & \quad \text{otherwise.}\\ \end{cases} $$

$$\text{Expected value} \quad E[X] = 1/\lambda.$$

$$\text{Variance} \quad Var[X] = 1/\lambda^2.$$


PDF of Exponential Distribution (WIKIPEDIA)

CDF of Exponential Distribution (WIKIPEDIA)
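
The same kind of hedged sampling check works here; note that NumPy's `exponential` takes the scale \(1/\lambda\), not the rate \(\lambda\):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5                                           # illustrative rate
x = rng.exponential(scale=1 / lam, size=1_000_000)  # scale = 1/lambda

print(x.mean(), 1 / lam)       # vs E[X] = 1/lambda
print(x.var(), 1 / lam**2)     # vs Var[X] = 1/lambda^2
```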



Erlang Random Variable


\(X\) is an \(Erlang(n, \lambda)\) random variable if the PDF of \(X\) is

$$f_{X}(x) = \begin{cases} \frac{\lambda^n x^{n-1} e^{-\lambda x}}{(n - 1)!} & \quad x \geq 0\\ 0 & \quad \text{otherwise.}\\ \end{cases} $$

where the parameter \(\lambda > 0\) and the parameter \(n \geq 1\) is an integer.

$$\text{Expected value} \quad E[X] = \frac{n}{\lambda}.$$

$$\text{Variance} \quad Var[X] = \frac{n}{\lambda^2}.$$


PDF of Erlang Distribution (WIKIPEDIA)

CDF of Erlang Distribution (WIKIPEDIA)
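
\(Erlang(n, \lambda)\) is the Gamma distribution with integer shape \(n\) and rate \(\lambda\), so NumPy's gamma sampler gives a quick check (a sketch with illustrative parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 4, 2.0
# Erlang(n, lambda) = Gamma with integer shape n and scale 1/lambda.
x = rng.gamma(shape=n, scale=1 / lam, size=1_000_000)

print(x.mean(), n / lam)       # vs E[X] = n/lambda
print(x.var(), n / lam**2)     # vs Var[X] = n/lambda^2
```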



Gaussian Random Variable


\(X\) is a \(Gaussian(\mu, \sigma)\) random variable if the PDF of \(X\) is

$$f_X(x) = \frac{1}{\sqrt{2\pi \sigma^2}}e^{-(x - \mu)^2/2\sigma^2}$$

where the parameter \(\mu\) can be any real number and the parameter \(\sigma > 0\).

$$\text{Expected value} \quad E[X] = \mu.$$

$$\text{Variance} \quad Var[X] = \sigma^2.$$


PDF of Normal Distribution (WIKIPEDIA)

CDF of Normal Distribution (WIKIPEDIA)
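
And the corresponding sampling sketch for the Gaussian case:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0                        # illustrative parameters
x = rng.normal(mu, sigma, size=1_000_000)

print(x.mean(), mu)          # vs E[X] = mu
print(x.var(), sigma**2)     # vs Var[X] = sigma^2
```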



Bivariate Gaussian Random Variable


Random variables \(X\) and \(Y\) have a bivariate Gaussian PDF with parameters \(\mu_1, \sigma_1, \mu_2, \sigma_2,\) and \(\rho\) if

$$f_{X, Y}(x, y) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}}\exp\left(-\frac{\Big(\frac{x - \mu_1}{\sigma_1}\Big)^2 - \frac{2\rho (x - \mu_1)(y - \mu_2)}{\sigma_1 \sigma_2} + \Big(\frac{y - \mu_2}{\sigma_2}\Big)^2}{2(1 - \rho^2)}\right) $$

where \(\mu_1\) and \(\mu_2\) can be any real numbers, \(\sigma_1 > 0\), \(\sigma_2 > 0\), and \(-1 < \rho < 1\).
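
To make the formula concrete, the sketch below (assuming SciPy is available; all numbers are illustrative) evaluates the bivariate PDF exactly as written above at one point and compares it with `scipy.stats.multivariate_normal`, whose covariance matrix is built from \(\sigma_1, \sigma_2, \rho\):

```python
import numpy as np
from scipy.stats import multivariate_normal

mu1, mu2, s1, s2, rho = 1.0, -2.0, 1.5, 0.8, 0.6
x, y = 0.5, -1.0

# Exponent of the bivariate Gaussian PDF, exactly as in the formula above.
q = (((x - mu1) / s1)**2
     - 2 * rho * (x - mu1) * (y - mu2) / (s1 * s2)
     + ((y - mu2) / s2)**2)
f = np.exp(-q / (2 * (1 - rho**2))) / (2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2))

# Same density via SciPy, with Cov = [[s1^2, rho*s1*s2], [rho*s1*s2, s2^2]].
cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]
print(f, multivariate_normal([mu1, mu2], cov).pdf([x, y]))  # should agree
```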





Gaussian Random Vector


\(\mathbf{X}\) is the \(Gaussian(\mathbf{\mu_X, C_X})\) random vector with expected value \(\mathbf{\mu_X}\) and covariance \(\mathbf{C_X}\) if and only if

$$f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}[\det(\mathbf{C_X})]^{1/2}}\exp\bigg(-\frac{1}{2}(\mathbf{x}-\mathbf{\mu_X})^T \mathbf{C_X^{-1}} (\mathbf{x} - \mathbf{\mu_X}) \bigg) $$

where \(\det(\mathbf{C_X})\), the determinant of \(\mathbf{C_X}\), satisfies \(\det(\mathbf{C_X}) > 0\).
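
A minimal NumPy sketch of this density, written directly from the formula (the vector `mu` and matrix `C` are illustrative; `C` must be symmetric positive definite so that \(\det(\mathbf{C_X}) > 0\)):

```python
import numpy as np

def gaussian_vector_pdf(x, mu, C):
    """Evaluate the Gaussian(mu, C) density at x, straight from the formula."""
    n = len(mu)
    d = x - mu
    norm = (2 * np.pi)**(n / 2) * np.linalg.det(C)**0.5
    return np.exp(-0.5 * d @ np.linalg.inv(C) @ d) / norm

mu = np.array([0.0, 1.0, -1.0])
C = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 1.5]])
print(gaussian_vector_pdf(np.array([0.5, 1.0, -0.5]), mu, C))
```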




Beta Random Variable


\(X\) is a \(Beta(\alpha, \beta)\) random variable if the PDF of \(X\) is

$$f_{X}(x) = \frac{\Gamma (\alpha + \beta)}{\Gamma (\alpha) \Gamma (\beta)}x^{\alpha - 1}(1 - x)^{\beta - 1} = \frac{1}{B(\alpha, \beta)}x^{\alpha - 1}(1 - x)^{\beta - 1} \quad \quad (\because B(\alpha, \beta) = \frac{\Gamma(\alpha) \Gamma(\beta)}{\Gamma (\alpha + \beta)}) $$

where the support is \(0 \leq x \leq 1\) and the shape parameters satisfy \(\alpha, \beta > 0\).

\(\Gamma(z) = \int_0^{\infty}x^{z - 1}e^{-x}\mathrm{d}x \quad \quad (\Gamma(n) = (n - 1)!\ \text{ if } n \text{ is a positive integer}).\)


$$\text{Expected value} \quad E[X] = \frac{\alpha}{\alpha + \beta}.$$

$$\text{Variance} \quad Var[X] = \frac{\alpha \beta}{(\alpha + \beta )^2(\alpha + \beta + 1)}.$$


PDF of Beta Distribution (WIKIPEDIA)

CDF of Beta Distribution (WIKIPEDIA)
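
A sampling sketch for the Beta moments (illustrative parameters, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 2.0, 5.0
x = rng.beta(alpha, beta, size=1_000_000)

print(x.mean(), alpha / (alpha + beta))                                    # E[X]
print(x.var(), alpha * beta / ((alpha + beta)**2 * (alpha + beta + 1)))    # Var[X]
```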



Proof - Gaussian




\(X\) is a \(Gaussian(\mu, \sigma)\) random variable if the PDF of \(X\) is

$$f_X(x) = \frac{1}{\sqrt{2\pi \sigma^2}}e^{-(x - \mu)^2/2\sigma^2}$$


Expected Value

First, verify that the PDF integrates to 1; this fact is used in the expected-value calculation below.

\( \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi\sigma^2}}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x \)


\( \text{Let}\quad \omega = \frac{x - \mu}{\sqrt{2\sigma^2}}, \mathrm{d}\omega = \frac{\mathrm{d}x}{\sqrt{2\sigma^2}}\\ \)


\( \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi\sigma^2}}e^{-\omega^2}\sqrt{2\sigma^2}\mathrm{d}\omega\\ =\frac{1}{\sqrt{\pi}}\int_{-\infty}^{\infty}e^{-\omega^2}\mathrm{d}\omega\\ \)


\( \begin{align} \bigg\{\int_{-\infty}^{\infty}e^{-\omega^2}\mathrm{d}\omega\bigg\}^2 &= \int_{-\infty}^{\infty}e^{-x^2}\mathrm{d}x \cdot \int_{-\infty}^{\infty}e^{-y^2}\mathrm{d}y\\ &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}e^{-(x^2 + y^2)}\mathrm{d}x\mathrm{d}y\\ \end{align} \)


\( \text{Let}\quad x = r\cos{\theta}, \quad y = r\sin{\theta}\\ \)


\( \begin{align} \int_{0}^{2\pi}\int_{0}^{\infty}e^{-r^2}\det(J(r,\theta))\,\mathrm{d}r\mathrm{d}\theta &= \int_{0}^{2\pi}\int_{0}^{\infty}e^{-r^2}r\,\mathrm{d}r\mathrm{d}\theta\\ &= 2\pi\int_{0}^{\infty}re^{-r^2}\mathrm{d}r\\ \end{align} \)


\( \text{Let}\quad s = -r^2, \quad \mathrm{d}s = -2r\mathrm{d}r\\ \)


\( \begin{align} 2\pi\int_{0}^{-\infty}-\frac{1}{2}e^{s}\mathrm{d}s &= 2\pi\int_{-\infty}^{0}\frac{1}{2}e^{s}\mathrm{d}s\\ &= \pi\int_{-\infty}^{0}e^{s}\mathrm{d}s\\ &=\pi e^s \Big|_{-\infty}^{0}\\&=\pi \end{align} \)


\( \begin{align} \bigg\{\int_{-\infty}^{\infty}e^{-\omega^2}\mathrm{d}\omega\bigg\}^2 = \pi, \quad \int_{-\infty}^{\infty}e^{-\omega^2}\mathrm{d}\omega = \sqrt{\pi} \end{align} \)


\( \begin{align} \frac{1}{\sqrt{\pi}}\int_{-\infty}^{\infty}e^{-\omega^2}\mathrm{d}\omega = 1 \end{align} \)


\( \begin{align} \therefore \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi\sigma^2}}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x = 1 \end{align} \)


\( \begin{align} E[X] &= \int_{-\infty}^{\infty}x\frac{1}{\sqrt{2\pi\sigma^2}}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x \\ &= \int_{-\infty}^{\infty}[x - \mu + \mu]\frac{1}{\sqrt{2\pi\sigma^2}}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x\\ &= \int_{-\infty}^{\infty}(x - \mu)\frac{1}{\sqrt{2\pi\sigma^2}}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x + \mu\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi\sigma^2}}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x\\ &= \int_{-\infty}^{\infty}(x - \mu)\frac{1}{\sqrt{2\pi\sigma^2}}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x + \mu \quad \quad \bigg( \because \int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi\sigma^2}}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x = 1 \bigg) \\ &= \frac{-\sigma^2}{\sqrt{2\pi\sigma^2}}\int_{-\infty}^{\infty}-\frac{(x - \mu)}{\sigma^2}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x + \mu\\ &= \frac{-\sigma^2}{\sqrt{2\pi\sigma^2}}\bigg[ e^{-(x - \mu)^2/2\sigma^2}\bigg]_{-\infty}^{\infty} + \mu\\ &= \frac{-\sigma^2}{\sqrt{2\pi\sigma^2}}\bigg[ 0 - 0\bigg] + \mu\\ &= \mu \end{align} \)


Variance

\( Var[X] = \int_{-\infty}^{\infty}(x - \mu)^2\frac{1}{\sqrt{2\pi\sigma^2}}e^{-(x - \mu)^2/2\sigma^2}\mathrm{d}x\\ \)


\( \text{Let}\quad \omega = \frac{x - \mu}{\sqrt{2\sigma^2}}, \mathrm{d}\omega = \frac{\mathrm{d}x}{\sqrt{2\sigma^2}}\\ \)


\( \begin{align} Var[X] &= \int_{-\infty}^{\infty}2\sigma^2 \omega^2 \frac{1}{\sqrt{2\pi\sigma^2}}e^{-\omega^2}\sqrt{2\sigma^2}\,\mathrm{d}\omega\\ &=\frac{2\sigma^2}{\sqrt{\pi}}\int_{-\infty}^{\infty}\omega^2 e^{-\omega^2}\mathrm{d}\omega\\ \end{align} \)


\( \text{Integration by parts with}\quad u = \omega , \quad v = -\frac{1}{2}e^{-\omega^2}, \quad u^{\prime} = 1, \quad v^{\prime} = \omega e^{-\omega^2}\quad \quad \bigg( \int uv^{\prime} = uv - \int u^{\prime}v \bigg) \)


\( \begin{align} Var[X] &= \frac{2\sigma^2}{\sqrt{\pi}}\bigg\{ \bigg[-\frac{1}{2}\omega e^{-\omega^2}\bigg]_{-\infty}^{\infty} - \int_{-\infty}^{\infty} - \frac{1}{2} e^{- \omega^2} \mathrm{d} \omega \bigg\}\\ &= \frac{2\sigma^2}{\sqrt{\pi}}\bigg\{ \bigg[0 - 0\bigg] + \frac{1}{2}\sqrt{\pi} \bigg\}\\ &= \sigma^2 \end{align} \)
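
The three results of this proof (normalization, \(E[X] = \mu\), \(Var[X] = \sigma^2\)) can also be confirmed by numerical integration; a sketch assuming SciPy's `quad`:

```python
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.3, 0.7   # illustrative parameters
f = lambda x: np.exp(-(x - mu)**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

total, _ = quad(f, -np.inf, np.inf)                              # ~ 1
mean, _ = quad(lambda x: x * f(x), -np.inf, np.inf)              # ~ mu
var, _ = quad(lambda x: (x - mu)**2 * f(x), -np.inf, np.inf)     # ~ sigma^2
print(total, mean, var)
```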




Families of Discrete Random Variables

Work in progress...


INDEX



Bernoulli Random Variable

Geometric Random Variable

Binomial Random Variable

Negative Binomial Random Variable (Pascal Random Variable)

Discrete Uniform Random Variable

Poisson Random Variable




Bernoulli Random Variable


\(X\) is a \(Bernoulli(p)\) random variable if the PMF of \(X\) has the form

$$P_{X}(x) = \begin{cases} 1-p & \quad x = 0\\ p & \quad x = 1\\ 0 & \quad \text{otherwise}\\ \end{cases} $$

where the parameter \(p\) is in the range \(0 < p < 1\).

$$\text{Expected value} \quad E[X] = p.$$

$$\text{Second moment} \quad E[X^2] = p.$$

$$\text{Variance} \quad Var[X] = p(1-p).$$


ex) The probability of success is \(p\) and the probability of failure is \(1 - p\); \(X = \) the success or failure of a single trial.
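
The moments follow directly from the two-point PMF; a tiny check in plain Python (illustrative \(p\)):

```python
p = 0.3
E = 0 * (1 - p) + 1 * p           # E[X] = p
E2 = 0**2 * (1 - p) + 1**2 * p    # E[X^2] = p
print(E, E2, E2 - E**2, p * (1 - p))  # variance matches p(1 - p)
```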



Geometric Random Variable


\(X\) is a \(Geometric(p)\) random variable if the PMF of \(X\) has the form

$$P_{X}(x) = \begin{cases} p(1-p)^{x-1} & \quad x = 1, 2, ...\\ 0 & \quad \text{otherwise.}\\ \end{cases} $$

where the parameter \(p\) is in the range \(0 < p < 1\).

$$\text{Expected value} \quad E[X] = 1/p.$$

$$\text{Second moment} \quad E[X^2] = (2-p)/p^2.$$

$$\text{Variance} \quad Var[X] = (1-p)/p^2.$$


ex) The probability of success is \(p\) and the probability of failure is \(1 - p\); \(X = \) the number of trials up to and including the first success, \(P_X(x) = \) the probability that the first success occurs on the \(x\)-th trial, \(E[X] = \) the expected number of trials until the first success. The trials are independent.
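
A hedged sampling check (NumPy's `geometric` also counts trials up to and including the first success, matching the support \(x = 1, 2, \dots\)):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.25
x = rng.geometric(p, size=1_000_000)

print(x.mean(), 1 / p)            # vs E[X] = 1/p
print(x.var(), (1 - p) / p**2)    # vs Var[X] = (1-p)/p^2
```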



Binomial Random Variable


\(X\) is a \(Binomial(n, p)\) random variable if the PMF of \(X\) has the form

$$P_{X}(x) = \binom{n}{x}p^x(1-p)^{n-x}$$

where \(0 < p < 1\) and \(n\) is an integer such that \(n \geq 1\).

$$\text{Expected value} \quad E[X] = np.$$

$$\text{Second moment} \quad E[X^2] = np(np-p+1).$$

$$\text{Variance} \quad Var[X] = np(1-p).$$


ex) The probability of success is \(p\) and the probability of failure is \(1 - p\); \(X = \) the number of successes, \(n = \) the number of trials, \(P_X(x) = \) the probability of exactly \(x\) successes in \(n\) trials, \(E[X] = \) the expected number of successes in \(n\) trials. The trials are independent.
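
Since the support is finite, the moments can be checked by summing the PMF directly (a sketch assuming SciPy's `comb`):

```python
import numpy as np
from scipy.special import comb

n, p = 10, 0.4
x = np.arange(n + 1)
pmf = comb(n, x) * p**x * (1 - p)**(n - x)

print((x * pmf).sum(), n * p)                        # E[X] = np
print((x**2 * pmf).sum(), n * p * (n * p - p + 1))   # E[X^2] = np(np - p + 1)
```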



Negative Binomial Random Variable (Pascal Random Variable)


\(X\) is a \(NB(r; p)\) random variable if the PMF of \(X\) has the form

$$f(k;r,p)\equiv \Pr(X=k)={\binom {k+r-1}{k}}p^{k}(1-p)^{r}\quad {\text{for }}k=0,1,2,\dotsc$$

where \(k\) is the number of successes, \(r\) is the number of failures, and \(p\) is the probability of success.

$$\text{Expected value} \quad E[X] = pr/(1-p).$$

$$\text{Second moment} \quad E[X^2] = pr(pr+1)/(1-p)^2.$$

$$\text{Variance} \quad Var[X] = pr/(1-p)^2.$$

$$\text{Skewness (third normalised moment)} \quad \gamma_1 = \frac{1+p}{\sqrt{pr}}.$$


ex) The probability of success is \(p\) and the probability of failure is \(1 - p\); \(k = \) the number of successes, \(r = \) the number of failures, \(P_X(x) = \) the probability of \(k\) successes in \(r + k\) trials, with the last trial being a failure.
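
Since the support is infinite, a truncated PMF sum gives a quick check of the moments above (a sketch; the truncation point is arbitrary but the tail is negligible for these illustrative parameters):

```python
import numpy as np
from scipy.special import comb

r, p = 5, 0.4                 # r failures, success probability p
k = np.arange(0, 2000)        # truncated support; the tail is negligible here
pmf = comb(k + r - 1, k) * p**k * (1 - p)**r

print((k * pmf).sum(), p * r / (1 - p))                       # E[X] = pr/(1-p)
print((k**2 * pmf).sum(), p * r * (p * r + 1) / (1 - p)**2)   # E[X^2]
```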


If we modify the negative binomial random variable above so that the last trial is a success, and let the number of trials be \(x\), we obtain the following PMF. Since at least one success is assumed, \(k\) must be greater than or equal to 1.

\(X\) is a \(Pascal(k, p)\) random variable if the PMF of \(X\) has the form

$$P_{X}(x) = \binom{x-1}{k-1}p^k(1-p)^{x-k}$$

where \(0 < p < 1\) and \(k\) is an integer such that \(k \geq 1\).

$$\text{Expected value} \quad E[X] = k/p.$$

$$\text{Second moment} \quad E[X^2] = \big(k(1-p)+k^2\big)/p^2.$$

$$\text{Variance} \quad Var[X] = k(1-p)/p^2.$$


ex) The probability of success is \(p\) and the probability of failure is \(1 - p\); \(k = \) the number of successes, \(x = \) the number of trials, \(P_X(x) = \) the probability that in \(x\) trials the last trial is a success and there are \(k\) successes in total.
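
The same truncated-sum check for the Pascal form:

```python
import numpy as np
from scipy.special import comb

k, p = 3, 0.35                # k successes required, success probability p
x = np.arange(k, 3000)        # support starts at x = k; tail is negligible
pmf = comb(x - 1, k - 1) * p**k * (1 - p)**(x - k)

print((x * pmf).sum(), k / p)                            # E[X] = k/p
print((x**2 * pmf).sum(), (k * (1 - p) + k**2) / p**2)   # E[X^2]
```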



Discrete Uniform Random Variable


\(X\) is a \(discrete\ uniform(k, l)\) random variable if the PMF of \(X\) has the form

$$P_{X}(x) = \begin{cases} 1/(l-k+1) & \quad x = k, k+1, k+2, \dotsc , l\\ 0 & \quad \text{otherwise}\\ \end{cases} $$

where the parameters \(k\) and \(l\) are integers such that \(k < l\).

$$\text{Expected value} \quad E[X] = (l + k)/2.$$

$$\text{Variance} \quad Var[X] = (l - k)(l - k +2)/12.$$
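
A direct finite-support check of both moments (illustrative \(k, l\)):

```python
import numpy as np

k, l = 3, 9
x = np.arange(k, l + 1)
pmf = np.full(x.shape, 1 / (l - k + 1))
E = (x * pmf).sum()

print(E, (l + k) / 2)                                         # E[X] = (l + k)/2
print(((x - E)**2 * pmf).sum(), (l - k) * (l - k + 2) / 12)   # Var[X]
```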





Poisson Random Variable


\(X\) is a \(Poisson(\alpha)\) random variable if the PMF of \(X\) has the form

$$P_{X}(x) = \begin{cases} \alpha^xe^{-\alpha}/x! & \quad x = 0, 1, 2, \dotsc, \\ 0 & \quad \text{otherwise}\\ \end{cases} $$

where the parameter \(\alpha\) is in the range \(\alpha > 0\).

$$\text{Expected value} \quad E[X] = \alpha.$$

$$\text{Variance} \quad Var[X] = \alpha.$$

$$\text{Skewness (third normalised moment)} \quad \gamma_1 = \alpha^{-1/2}.$$


ex) \(\alpha = \lambda T = \) the expected number of events during an interval \(T\); \(\lambda = \) the average event rate per unit of \(T\) (if \(T\) is a time, the average number of events per unit time); \(T = \) the observation interval; \(P_X(x) = \) the probability that exactly \(x\) events occur during the observation interval \(T\).
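
A truncated PMF sum also confirms that the mean and variance are both \(\alpha\) (a sketch assuming SciPy's vectorized `factorial`):

```python
import numpy as np
from scipy.special import factorial

alpha = 4.2
x = np.arange(0, 120)                          # truncated support; tail negligible
pmf = alpha**x * np.exp(-alpha) / factorial(x)
E = (x * pmf).sum()

print(E, ((x - E)**2 * pmf).sum(), alpha)      # mean and variance both ~ alpha
```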




Proof - Bernoulli



PMF of \(Bernoulli(p)\) is

$$P_{X}(x) = \begin{cases} 1-p & \quad x = 0\\ p & \quad x = 1\\ 0 & \quad \text{otherwise}\\ \end{cases} $$


Expected Value

\( E[X] = \mu_X = \displaystyle\sum_{x \in S_X} x P_X(x)\\ E[X] = 0 \times (1 - p) + 1 \times p = p\\ \quad \\ \therefore E[X] = p \)


Variance

\( Var[X] = E[X^2] - (E[X])^2\\ E[X^2] = 0^2 \times (1 - p) + 1^2 \times p = p\\ Var[X] = p - p^2\\ \quad \\ \therefore Var[X] = p(1 - p) \)



Proof - Geometric



PMF of \(Geometric(p)\) is

$$P_{X}(x) = \begin{cases} p(1-p)^{x-1} & \quad x = 1, 2, ...\\ 0 & \quad \text{otherwise.}\\ \end{cases} $$


Expected Value

\( \begin{align} E[X] &= \displaystyle\sum_{x = 1}^{\infty} x p(1-p)^{x - 1}\\ &= p + 2p(1-p) + 3p(1-p)^2 + \dots\\ \end{align} \)


\( \begin{align} (1-p)E[X] &= \displaystyle\sum_{x = 1}^{\infty} x p(1-p)^{x}\\ &= p(1-p) + 2p(1-p)^2 + 3p(1-p)^3 + \dots\\ \end{align} \)


\( \begin{align} E[X] - (1-p)E[X] &= p E[X] = p + p(1 - p) + p(1 - p)^2 + \dots\\ &= \displaystyle\sum_{x = 1}^{\infty} p(1 - p)^{x - 1} = \frac{p}{1 - (1 - p)} = 1\\ \end{align} \)


\(\therefore E[X] = \frac{1}{p}\)


Variance

\( \begin{align} E[X^2] &= \displaystyle\sum_{x = 1}^{\infty} x^2 p(1-p)^{x - 1}\\ &= p + 4p(1-p) + 9p(1-p)^2 + \dots\\ \end{align} \)


\( \begin{align} E[X^2] - (1-p)E[X^2] &= p E[X^2]\\ &= p + 3p(1-p) + 5p(1-p)^2 + 7p(1-p)^3 + \dots\\ &= \displaystyle\sum_{x = 1}^{\infty} p(1 - p)^{x - 1} + 2(1-p)\displaystyle\sum_{x = 1}^{\infty} x p(1 - p)^{x - 1}\\ &= 1 + 2(1-p)E[X] = 1 + \frac{2(1 - p)}{p} = \frac{2 - p}{p}\\ \end{align} \)


\( E[X^2] = \frac{2 - p}{p^2} \)


\( \begin{align} \therefore Var[X] &= E[X^2] - (E[X])^2\\ &=\frac{2-p}{p^2} - \frac{1}{p^2}\\ &=\frac{1-p}{p^2} \end{align} \)
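
The series manipulations above can be double-checked numerically with truncated partial sums (a sketch; 2000 terms is far more than enough for \(p = 0.25\)):

```python
import numpy as np

p = 0.25
x = np.arange(1, 2000)
pmf = p * (1 - p)**(x - 1)

print((x * pmf).sum(), 1 / p)                 # E[X] = 1/p
print((x**2 * pmf).sum(), (2 - p) / p**2)     # E[X^2] = (2 - p)/p^2
```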



Proof - Binomial



PMF of \(Binomial(n, p)\) is

$$P_{X}(x) = \binom{n}{x}p^x(1-p)^{n-x}$$


Expected Value

\( \begin{align} E[X] &= \displaystyle\sum_{x = 0}^{n} x \binom{n}{x} p^x(1 - p)^{n-x}\\ &= \displaystyle\sum_{x = 0}^{n} \frac{n!}{(n-x)!x!}x \cdot p^x (1 - p)^{n-x}\\ &= \displaystyle\sum_{x = 1}^{n} \frac{n!}{(n-x)!(x - 1)!} p^x (1 - p)^{n-x}\quad \quad \because \text{the } x = 0 \text{ term is zero}\\ &= \displaystyle\sum_{x = 1}^{n} \frac{n (n - 1)!}{(n-x)!(x - 1)!} p \cdot p^{x - 1} (1 - p)^{n-x}\\ &= np\displaystyle\sum_{x = 1}^{n} \frac{(n - 1)!}{(n-x)!(x - 1)!} p^{x - 1} (1 - p)^{n-x}\\ \\ &\text{Let } a = x - 1, b = n - 1 \text{ then},\\ \\ &= np\displaystyle\sum_{a = 0}^{b} \frac{b!}{(b - a)!a!} p^{a} (1 - p)^{b - a}\\ &= np \quad \quad \because \displaystyle\sum_{a = 0}^{b} \frac{b!}{(b - a)!a!} p^{a} (1 - p)^{b - a} = 1\\ \end{align} \)


\( \begin{align} \therefore E[X] = np\\ \end{align} \)


Variance

\( \begin{align} E[X(X - 1)] &= E[X^2] - \mu = E[X^2] - \mu^2 + \mu^2 - \mu\\ &= Var[X] + \mu^2 - \mu \end{align} \)


\( Var[X] = E[X(X - 1)] - \mu^2 + \mu\\ \)


\( \begin{align} E[X(X - 1)] &= \displaystyle\sum_{x = 0}^{n} x (x - 1) \binom{n}{x}p^{x}(1 - p)^{n - x}\\ &= \displaystyle\sum_{x = 2}^{n} x (x - 1) \binom{n}{x}p^{x}(1 - p)^{n - x} \quad \quad \because x (x - 1) \binom{n}{x}p^{x}(1 - p)^{n - x} = 0 \quad \text{ when } x = 0, 1\\ &= \displaystyle\sum_{x = 2}^{n} x (x - 1) \frac{n!}{(n-x)!x!} p^{x} (1 - p)^{n-x}\\ &= n(n - 1) p^2\displaystyle\sum_{x = 2}^{n} x (x - 1) \frac{(n - 2)!}{(n-x)!x!} p^{x - 2} (1 - p)^{n-x}\\ &= n(n - 1) p^2\displaystyle\sum_{x = 2}^{n} \frac{(n - 2)!}{(n-x)!(x - 2)!} p^{x - 2} (1 - p)^{n-x}\\ \\ &\text{Let } a = x - 2, b = n - 2 \text{ then},\\ \\ &= n(n - 1) p^2\displaystyle\sum_{a = 0}^{b} \frac{b!}{(b - a)!a!} p^{a} (1 - p)^{b - a}\\ &= n(n - 1) p^2 \quad \quad \because \displaystyle\sum_{a = 0}^{b} \frac{b!}{(b - a)!a!} p^{a} (1 - p)^{b - a} = 1\\ \end{align} \)


\( \begin{align} \therefore Var[X] &= n(n-1)p^2 - n^2p^2 + np = np - np^2\\ &= np(1 - p) \end{align} \)
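
A quick numerical confirmation of the key identity \(E[X(X-1)] = n(n-1)p^2\) used above (illustrative \(n, p\), assuming SciPy's `comb`):

```python
import numpy as np
from scipy.special import comb

n, p = 12, 0.3
x = np.arange(n + 1)
pmf = comb(n, x) * p**x * (1 - p)**(n - x)

print((x * (x - 1) * pmf).sum(), n * (n - 1) * p**2)   # E[X(X-1)] = n(n-1)p^2
```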

