OpenStudy (anonymous):

If T~Po(m) and E(T^2)=6, find a) the value of m b) P(X=0)

OpenStudy (anonymous):

@ganeshie8

OpenStudy (kirbykirby):

If \(T\sim \text{Poisson}(m)\), then the pmf is \[ P(T=t)=f(t) = \frac{m^te^{-m}}{t!}, \quad m >0,\ t=0,1,2,\dots\] Now, \[E(T^2)=\sum_{\text{all }t}t^2f(t)=6\\ \\ ~ \\ = \sum_{t=0}^{\infty}t^2\frac{m^te^{-m}}{t!} \]

OpenStudy (kirbykirby):

Actually, if you already know the result for the variance and expected value for the Poisson distribution, you could directly say: \[Var(T)=m\\ E(T) = m\\ Var(T)=E(T^2)-[E(T)]^2=E(T^2)-(m)^2\\ \implies m = 6 - m^2\]

OpenStudy (kirbykirby):

If you can't use that result directly, you can still use the summation: \[ \large E(T^2)=\sum_{\text{all }t}t^2f(t)=6\\ \\ ~ \\ = \large\sum_{t=0}^{\infty}t^2\frac{m^te^{-m}}{t!} \\ \large = \sum_{t=0}^{\infty}(t^2-t+t)\frac{m^te^{-m}}{t!}\\ \large =\sum_{t=0}^{\infty}(t^2-t)\frac{m^te^{-m}}{t!}+\sum_{t=0}^{\infty}t\frac{m^te^{-m}}{t!}\\ \large =\sum_{t=2}^{\infty}t(t-1) \frac{m^te^{-m}}{t(t-1)(t-2)!}+\sum_{t=1}^{\infty}t\frac{m^te^{-m}}{t(t-1)!}\\ \large= e^{-m}m^2\sum_{t=2}^{\infty}\frac{m^{t-2}}{(t-2)!}+me^{-m}\sum_{t=1}^{\infty}\frac{m^{t-1}}{(t-1)!}\\ \large =e^{-m}m^2\sum_{t=0}^{\infty}\frac{m^t}{t!}+me^{-m}\sum_{t=0}^{\infty}\frac{m^t}{t!}\\ \large =e^{-m}m^2e^m+me^{-m}e^m\\ \large =m^2+m\]

OpenStudy (kirbykirby):

and so \(m^2 + m = 6\)

OpenStudy (kirbykirby):

b) I'm assuming P(X=0) is supposed to be P(T=0)? I hope this is a typo; otherwise there might be a random variable X defined in your problem. But assuming it's P(T=0): \[ \large P(T=0)=\frac{m^0e^{-m}}{0!}=e^{-m}\] Of course, you plug in the value of m you found in a) :)
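Assuming part a) yields the positive root m = 2 of \(m^2+m=6\), part b) reduces to a one-line computation:

```python
import math

# P(T = 0) = m^0 e^{-m} / 0! = e^{-m}, with m = 2
# (the positive root of m^2 + m = 6 from part a).
m = 2
p0 = m**0 * math.exp(-m) / math.factorial(0)
print(p0)  # e^{-2} ≈ 0.1353
```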

OpenStudy (anonymous):

Note that the parameter of a Poisson distribution must be positive, so you can discard the negative root of the quadratic in \(m\); the positive root is \(m = 2\).
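A quick sketch of solving \(m^2+m-6=0\) with the quadratic formula and keeping only the admissible root:

```python
import math

# Solve m^2 + m - 6 = 0 via the quadratic formula, then keep the
# positive root, since a Poisson parameter must satisfy m > 0.
a, b, c = 1, 1, -6
roots = [(-b + s * math.sqrt(b**2 - 4*a*c)) / (2*a) for s in (1, -1)]
m = max(roots)  # the positive root
print(roots)    # [2.0, -3.0]
print(m)        # 2.0
```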

OpenStudy (anonymous):

@kirbykirby it is written to find P(X=0). Also, I'm lost at this step: \(Var(T)=m,\ E(T)=m,\ Var(T)=E(T^2)-[E(T)]^2=E(T^2)-m^2 \implies m=6-m^2\). How can Var(T) = E(T)? And why are they both equal to m?

OpenStudy (kirbykirby):

Well, we can't find P(X=0) if we don't know the distribution of X; there is no information about it in your question. As for the variance and expectation, they are indeed equal to each other for the Poisson distribution, and both equal the parameter (which is "m" in this case). This is a well-known result about the Poisson distribution (though it is not the only distribution where mean and variance can coincide). You can actually prove it yourself using the definitions of expectation and variance, i.e.: \[ E(T)=\sum_{t}tf(t)\\ Var(T)=E(T^2)-[E(T)]^2\] In fact, the proof I used above for \(E(T^2)\) gives you almost all the tools to prove E(T) = m and Var(T) = m very easily. If you notice, on the 4th line of the proof, the 2nd summation, \( \Large \sum_{t=0}^{\infty}t\frac{m^te^{-m}}{t!}\), is actually the definition of the expectation, E(T). And once you know \(E(T^2)\), the variance is just one small step away :)

OpenStudy (anonymous):

@kirbykirby thank you once again for your detailed explanation!
