OpenStudy (anonymous):

1) Let U, V, and W be independent standard normal random variables (that is, independent normal random variables, each with mean 0 and variance 1), and let X = 3U + 4V and Y = U + W. Give a numerical answer for each part below.
a) What is the probability that X ≥ 8?
b) E[XY]?
c) var(X + Y)?

2) Let X be a random variable that takes values in [1, ∞), with a PDF of the form fX(x) = c/x^3 if x ≥ 1, and 0 otherwise. Let U be a uniform random variable on [0, 2]. Assume that X and U are independent.
a) What is the value of the constant c?
b) P(X ≤ U)?
c) Find the PDF of D = 1/X. Express your answer in terms of d.

3) Let N, X1, Y1, X2, Y2, … be independent random variables. The random variable N takes positive integer values and has mean a and variance r. The random variables Xi are independent and identically distributed with mean b and variance s, and the random variables Yi are independent and identically distributed with mean c and variance t. Let A = ∑Xi and B = ∑Yi.
a) Find cov(A, B). Express your answer in terms of the given means and variances using standard notation.
b) Find var(A + B). Express your answer in terms of the given means and variances using standard notation.

4) Manhole explosions (usually caused by gas leaks and sparks) are on the rise in your city. On any given day, the manhole cover near your house explodes with some unknown probability, which is the same across all days. We model this unknown probability of explosion as a random variable Q, which is uniformly distributed between 0 and 0.1. Let Xi be a Bernoulli random variable that indicates whether the manhole cover near your house explodes on day i (where today is day 1).
a) E[Xi]?
b) var(Xi)?
c) Let A be the event that the manhole cover did not explode yesterday (i.e., X0 = 0). Find the conditional PDF of Q given A. Express your answer in terms of q using standard notation. For 0 ≤ q ≤ 0.1, fQ∣A(q) = ?

5) Consider a fire alarm that senses the environment constantly to figure out if there is smoke in the air and hence to conclude whether there is a fire or not. Consider a simple model for this phenomenon. Let Θ be the unknown true state of the environment: Θ = 1 means that there is a fire and Θ = 0 means that there is no fire. The signal observed by the alarm at time n is Xn = Θ + Wn, where the random variable Wn represents noise. Assume that Wn is Gaussian with mean 0 and variance 1 and is independent of Θ. Furthermore, assume that for i ≠ j, Wi and Wj are independent. Suppose that Θ is 1 with probability 0.1 and 0 with probability 0.9.
a) Given the observation X1 = 0.5, calculate the posterior distribution of Θ; that is, find the conditional distribution of Θ given X1 = 0.5.
b) What is the LMS estimate of Θ given X1 = 0.5?
c) What is the resulting conditional mean squared error of the LMS estimator given X1 = 0.5?

6) Let Θ be an unknown random variable that we wish to estimate. It has a prior distribution with mean 1 and variance 2. Let W be a noise term, another unknown random variable with mean 3 and variance 5. Assume that Θ and W are independent. We have two different instruments that we can use to measure Θ. The first instrument yields a measurement of the form X1 = Θ + W, and the second instrument yields a measurement of the form X2 = 2Θ + 3W. We pick an instrument at random, with each instrument having probability 1/2 of being chosen. Assume that this choice of instrument is independent of everything else. Let X be the measurement that we observe, without knowing which instrument was used.
a) E[X]?
b) E[X^2]?
c) The LLMS estimator of Θ given X is of the form aX + b. Give the numerical values of a and b.

OpenStudy (anonymous):

X is a normal random variable with mean 0 and variance 3^2 + 4^2 = 25, and Y is a normal random variable with mean 0 and variance 1^2 + 1^2 = 2.

OpenStudy (anonymous):

Some wishful thinking there, posting all those questions.

OpenStudy (anonymous):

Thanks eliassaab. I am new here SithsAndGiggles. I couldn't figure out how to ask multiple questions. :(

OpenStudy (anonymous):

No worries. Have you learned about the mgf? I think using it would make the first part of 1 very easy to do.

OpenStudy (anonymous):

mgf? umm...am not sure what that means. I am just a beginner. A detailed explanation would be very helpful. Hope I am not asking for too much.

OpenStudy (anonymous):

Moment-generating function? I assumed you'd heard of it, considering the last questions are about estimators.

OpenStudy (anonymous):

I am sorry. I do know a little about moment-generating functions. But my concepts are not that clear yet, I guess.

OpenStudy (anonymous):

That's alright, I was just going to suggest using the mgf of \(X\) to find its density function, as well as its mean/variance. Now that I reconsider it, eliassaab's method would be much simpler. You're given that \(X\) and \(Y\) are linear combinations of some known standard normal r.v.'s, and so \[E(X)=E(3U+4V)=3E(U)+4E(V)=0\\ E(Y)=E(U+W)=E(U)+E(W)=0\] \[\text{Var}(X)=\text{Var}(3U+4V)=9\text{ Var}(U)+16\text{ Var}(V)=25\\ \text{Var}(Y)=\text{Var}(U+W)=\text{Var}(U)+\text{Var}(W)=2\] Now that you know \(X\) is normally distributed with mean 0 and variance 25 (and thus standard deviation 5), you can find the probability in part (a): \[P(X\ge8)=P\left(\frac{X-0}{5}\ge\frac{8-0}{5}\right)=P(Z\ge1.6)\] Refer to a table for the value.
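If you don't have a table handy, here is a quick numeric sanity check (a sketch in Python, using only the standard library) that evaluates the tail probability from the complementary error function:

```python
import math

# P(Z >= z) for a standard normal Z, via the complementary error function:
# P(Z >= z) = 0.5 * erfc(z / sqrt(2))
def normal_tail(z):
    return 0.5 * math.erfc(z / math.sqrt(2))

# X = 3U + 4V ~ N(0, 25), so P(X >= 8) = P(Z >= 8/5) = P(Z >= 1.6)
p = normal_tail(1.6)
print(round(p, 4))  # 0.0548
```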

OpenStudy (anonymous):

For the second part of 1, regarding finding the expected value of the product \(XY\)... I think you can use the fact that \(U,V,W\) all follow the same distribution to condense the given equations to \[\begin{cases}X=3U+4V=7U\\ Y=U+W=2U\end{cases}\] Now, \(XY=14U^2\), so \(E(XY)=E(14U^2)=14E(U^2)\). The next step I have in mind is using a fact you may or may not know. Have you learned about the sum of squared standard normal r.v.'s? Namely, that the sum follows a chi-square distribution with degrees of freedom depending on the number of terms, i.e. \[\large\sum_{i=1}^nX_i^2\sim\chi^2_{\text{df}=n}\] In this case, since we only have one term, \(U^2\), the expected value is simply the number of degrees of freedom; 1 in this case. So, \(E(XY)=14\).

OpenStudy (anonymous):

Next, for part (c), you have a fairly simple task. \[\text{Var}(X+Y)=\text{Var}(4V+4U+W)=16\text{ Var}(V)+16\text{ Var}(U)+\text{ Var}(W)=33\]
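If you'd like to double-check that variance numerically, here's a small Monte Carlo sketch (standard-library Python; the seed and sample size are arbitrary choices, not part of the problem):

```python
import random

random.seed(0)
N = 200_000
vals = []
for _ in range(N):
    u = random.gauss(0, 1)
    v = random.gauss(0, 1)
    w = random.gauss(0, 1)
    vals.append((3*u + 4*v) + (u + w))  # X + Y = 4U + 4V + W

m = sum(vals) / N
var = sum((x - m) ** 2 for x in vals) / N
print(var)  # close to the exact value 33
```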

OpenStudy (anonymous):

Thanks a lot SithsAndGiggles. That solves the 1st question. Thanks a lot for explaining it so beautifully.

OpenStudy (anonymous):

What about 2,3,4,5 and 6?

OpenStudy (anonymous):

Don't know about chi-squared distribution. Will have to read up on that.

OpenStudy (kirbykirby):

For the second part of 1, @SithsAndGiggles , usually you cannot combine random variables like that: identically distributed does not mean equal, so for i.i.d. \(X_1,X_2\) we have \(X_1+X_2\ne2X_1\). A simple counterexample: suppose \(X_1\sim N(0,1)\) and \(X_2\sim N(0,1)\), with \(X_1,X_2\) independent. Then \(X_1-X_2\sim N(0,2)\), but according to your method this would give \(X_1-X_2=0\). Recall that if \(X_i,\ i=1,2\) are independent \(N(\mu_i,\sigma_i^2)\) r.v.'s, and \(a_i\) are constants, then \[a_1X_1+a_2X_2\sim N\left(a_1\mu_1+a_2\mu_2,\ a_1^2\sigma_1^2+a_2^2\sigma_2^2\right).\] Thus, \(X_1-X_2\sim N\left((1)(0)+(-1)(0),(1)^2(1)+(-1)^2(1)\right)=N(0,2)\). So, what you can do is: \[E(XY)=E[(3U+4V)(U+W)]=E[3U^2+3UW+4VU+4VW]\\ =3E[U^2]+3E(U)E(W)+4E(V)E(U)+4E(V)E(W)\quad\text{(by independence)}\\ =3\left(\text{Var}(U)+[E(U)]^2\right)+3E(U)E(W)+4E(V)E(U)+4E(V)E(W)\\ =3(1+0^2)+3(0)(0)+4(0)(0)+4(0)(0)\\ =3\]
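A quick Monte Carlo sketch (standard-library Python, arbitrary seed and sample size) confirms the expectation is 3 rather than 14:

```python
import random

random.seed(42)
N = 500_000
total = 0.0
for _ in range(N):
    u = random.gauss(0, 1)
    v = random.gauss(0, 1)
    w = random.gauss(0, 1)
    total += (3*u + 4*v) * (u + w)  # X * Y

est = total / N
print(est)  # close to the exact value E(XY) = 3
```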

OpenStudy (anonymous):

thanks a ton @kirbykirby !

OpenStudy (anonymous):

@SithsAndGiggles and @kirbykirby can you help me with 2,3,4,5 and 6 as well? It would be a huge help!

OpenStudy (kirbykirby):

For 2a) Recall that PDFs integrate to 1! That is, \[ \int_{\forall x}f_X(x) \,dx=1\] So you want \[\int_1^{\infty}\frac{c}{x^3}\,dx=1 \] These are the bounds of your integral since \(x\ge1\), which means \(x\in[1,+\infty)\). \[c\int_1^{\infty}x^{-3}\,dx =c\left.\frac{x^{-2}}{-2}\right|_1^{\infty} =-\frac{c}{2}\left(0-\frac{1}{1^2}\right)=\frac{c}{2}\] But this integral equals 1, so \[1=\frac{c}{2}\implies c=2\]
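As a sanity check, a simple midpoint-rule approximation (a Python sketch; the truncation point 100 and step count are arbitrary) confirms that c = 2 makes the density integrate to 1:

```python
# Check that the PDF f(x) = 2/x^3 integrates to 1 over [1, infinity).
# The integral is truncated at b = 100; the exact tail beyond that is 1/100^2 = 1e-4.
def f(x):
    return 2.0 / x**3

a, b, n = 1.0, 100.0, 500_000
h = (b - a) / n
total = h * sum(f(a + (i + 0.5) * h) for i in range(n))
print(total)  # ≈ 1 minus the truncated tail of 1e-4
```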

OpenStudy (kirbykirby):

b) Normally this requires the joint PDF of X and U. In general: \[\large P\left( (X,U)\in D\right)=\iint\limits_{(x,u)\in D\cap A}f_{X,U}(x,u)\,dx\,du\] where \(A\) is the support of the joint PDF of \(X,U\), that is \(A=\{(x,u)\mid f_{X,U}(x,u)>0 \}\), and \(D\) is usually just the region imposed by some given inequality. But luckily, X and U are independent, so \(f_{X,U}(x,u)=f_X(x)f_U(u)\). We found the PDF of X above; the PDF of U is just that of a uniform distribution on [0, 2], so \(f_U(u)=\frac{1}{2}\). Thus, \[\large P(X\le U) =\iint\limits_{\{x\le u\}\cap A}f_X(x)f_U(u)\,dx\,du\] Drawing a figure always helps when you have these types of questions to find the appropriate integral bounds. [sketch: the triangular region with \(1\le x\le u\) and \(1\le u\le 2\); since \(X\ge1\), the event \(\{x\le u\}\) is non-empty only for \(u\ge1\)] Thus, \[\Large P(X\le U)=\int_{u=1}^{u=2}\int_{x=1}^{x=u}\frac{2}{x^3}\cdot\frac{1}{2}\, dx \,du\\ =\frac{1}{2}\int_{u=1}^{u=2}\left(\int_{x=1}^{x=u}\frac{2}{x^3}\,dx\right)du\\ =\frac{1}{2}\int_{u=1}^{u=2}\left(\left.-x^{-2}\right|_1^u\right)du\\ =-\frac{1}{2}\int_{u=1}^{u=2}(u^{-2}-1)\,du\\ =-\frac{1}{2}\left[ \left.\frac{u^{-1}}{-1}\right|_1^2-\left.u\right|_1^2\right]\\ =\frac{1}{4}\]
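You can also check that 1/4 by simulation (a Python sketch using inverse-CDF sampling for X; the seed and sample size are arbitrary):

```python
import random

random.seed(1)
N = 400_000
hits = 0
for _ in range(N):
    # F_X(x) = 1 - 1/x^2 for x >= 1, so inverting gives X = (1 - V)^(-1/2)
    # for V uniform on [0, 1).
    x = (1.0 - random.random()) ** -0.5
    u = random.uniform(0.0, 2.0)
    if x <= u:
        hits += 1

prob = hits / N
print(prob)  # close to the exact value 1/4
```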
