OpenStudy (kirbykirby):

I have in my problem that \(X_1,\ldots,X_n\) is a random sample from a distribution with probability density \(f(x;\theta)=\theta x^{\theta-1}\), \(0<x<1\), and I have to show that \(-\dfrac{1}{n}\displaystyle\sum_{i=1}^n \log X_i \xrightarrow{p} \dfrac{1}{\theta}\).

OpenStudy (kirbykirby):

The WLLN (Weak Law of Large Numbers) states: If \(X_1,\ldots,X_n\) is a random sample from a distribution with \(E(X_i)=\mu\) and \(Var(X_i)=\sigma^2<\infty\), then \[\bar{X}_n=\frac{1}{n}\sum_{i=1}^n X_i \xrightarrow{p} \mu\]
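Here's a quick simulation sketch of what the WLLN says, using my own illustrative setup (assuming \(X_i\sim\text{Uniform}(0,1)\), so \(\mu=0.5\)): the probability that the sample mean lands more than \(\epsilon\) away from \(\mu\) shrinks as \(n\) grows.

```python
import numpy as np

# Illustration of the WLLN: estimate P(|X̄_n - μ| > ε) for increasing n.
# Assumed setup (not from the problem): X_i ~ Uniform(0, 1), so μ = 0.5.
rng = np.random.default_rng(42)
eps, trials = 0.01, 500

results = {}
for n in (100, 1000, 10000):
    means = rng.random((trials, n)).mean(axis=1)        # one sample mean per trial
    results[n] = np.mean(np.abs(means - 0.5) > eps)     # empirical tail probability
    print(n, results[n])                                # decreases toward 0 as n grows
```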

OpenStudy (anonymous):

My instructor skipped over some of the details, so it's not as fresh in my mind as I'd like it to be... Here's an attempt at deciphering my notes. The specific density of the \(X_i\) (beta, by the way) only enters through the moments of the transformed variables. Denote \(Y_i=-\log X_i\); since \(X_i\) has CDF \(x^\theta\) on \((0,1)\), \(Y_i\) is exponential with rate \(\theta\). Then you have to show that \(\bar{Y}_n=\dfrac{1}{n}\displaystyle\sum_{i=1}^nY_i \xrightarrow{p}E(Y)=\dfrac{1}{\theta}\), i.e. \[\lim_{n\to\infty}P\left(\left|\bar{Y}_n-\frac{1}{\theta}\right|>\epsilon\right)=0\quad\text{for every }\epsilon>0.\] You know that \(E(Y_i)=E(\bar{Y}_n)=\dfrac{1}{\theta}\), \(V(Y_i)=\dfrac{1}{\theta^2}\), and \(V(\bar{Y}_n)=\dfrac{1}{n}V(Y_i)=\dfrac{1}{n\theta^2}\). By Chebyshev's inequality, \[P\left(\left|\bar{Y}_n-\frac{1}{\theta}\right|\ge\epsilon\right)\le\frac{V(\bar{Y}_n)}{\epsilon^2}=\frac{1}{n\theta^2\epsilon^2},\] and the right side approaches 0 as \(n\to\infty\).
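If you want to see the Chebyshev argument numerically, here's a quick sketch (my own illustration, with a hypothetical \(\theta=2\); \(X\) is sampled by inverse transform, since its CDF is \(x^\theta\) on \((0,1)\)):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0           # hypothetical parameter value for illustration
eps, trials = 0.05, 1000

probs = {}
for n in (50, 500, 5000):
    # Inverse-transform sampling: X has CDF x**theta on (0,1), so X = U**(1/theta).
    x = rng.random((trials, n)) ** (1.0 / theta)
    y_bar = -np.log(x).mean(axis=1)                      # one Ȳ_n per trial
    probs[n] = np.mean(np.abs(y_bar - 1.0 / theta) >= eps)
    bound = 1.0 / (n * theta**2 * eps**2)                # Chebyshev bound
    print(n, probs[n], bound)                            # empirical prob stays under the bound
```

The empirical tail probability shrinks with \(n\) and sits below the Chebyshev bound \(\frac{1}{n\theta^2\epsilon^2}\), which is exactly why the bound forces convergence in probability.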

OpenStudy (anonymous):

And that's all there is to it. (In fact, this Chebyshev argument is precisely the standard proof of the WLLN; alternatively, since the \(Y_i\) are i.i.d. with finite variance, you can just cite the WLLN for \(\bar{Y}_n\) directly.)

OpenStudy (kirbykirby):

ohhhh I see thank you so much!!

OpenStudy (kirbykirby):

:D

OpenStudy (anonymous):

You're welcome!
