OpenStudy (kirbykirby):

An iid sample \(X_1,\dots,X_n\) has pdf \(f(x;\theta)=\theta x^{\theta-1},\quad 0<x<1,\ \theta>0\). The MLE is \[\hat{\theta}=\frac{-n}{\sum_{i=1}^{n}\log(X_i)}.\] I need to find the expectation of \(\hat{\theta}\).
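For intuition, here is a minimal sketch (assuming NumPy; the values of \(\theta\) and \(n\) are arbitrary choices of mine) that draws a sample from this pdf via inverse-CDF sampling, since \(F(x)=x^\theta\) on \((0,1)\) gives \(X=U^{1/\theta}\), and then computes the MLE:

```python
import numpy as np

# Illustrative values (not from the thread): true theta and sample size.
rng = np.random.default_rng(0)
theta, n = 2.0, 50

# Inverse-CDF sampling: F(x) = x^theta on (0,1), so X = U^(1/theta).
u = rng.uniform(size=n)
x = u ** (1.0 / theta)

# MLE from the thread: theta_hat = -n / sum(log X_i).
theta_hat = -n / np.sum(np.log(x))
```

With a moderate sample size, `theta_hat` lands near the true \(\theta\), though (as the thread's expectation computation shows) it is not unbiased.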

OpenStudy (kirbykirby):

It suggests showing \(-\log X_i\sim\text{EXP}(1/\theta)\) (exponential with mean \(1/\theta\)), and hence \(-\sum_{i=1}^n \log X_i\sim\text{GAM}(n, 1/\theta)\), which I did and it checks out. The next step, though, says \[E\left[\left(-\sum_{i=1}^n \log X_i\right)^{-1}\right]=\left(\frac{1}{\theta}\right)^{-1}\frac{\Gamma(n-1)}{\Gamma(n)}=\frac{\theta}{n-1}.\] I understand that the expected value for GAM\((n, 1/\theta)\) is \(n/\theta\), but I'm not sure how they got the line above. Do we apply a one-to-one transformation \(Y=1/X\) and then calculate the expectation by definition, or is there a faster way to do this?
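One quick way to sanity-check the displayed identity is a Monte Carlo estimate of \(E[W^{-1}]\) for \(W\sim\text{GAM}(n,1/\theta)\); a sketch assuming NumPy, whose `gamma` sampler uses the shape/scale parameterization (so scale \(=1/\theta\)):

```python
import numpy as np

# Illustrative values (my own): theta = 2, n = 10, so theta/(n-1) = 2/9.
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 10, 200_000

# W ~ GAM(shape=n, scale=1/theta); E[W] = n/theta as noted in the thread.
w = rng.gamma(shape=n, scale=1.0 / theta, size=reps)

mc = np.mean(1.0 / w)       # Monte Carlo estimate of E[1/W]
exact = theta / (n - 1)     # the textbook's claimed value
```

The two numbers agree to a few decimal places, which supports the step even before deriving it analytically.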

OpenStudy (kirbykirby):

Well, a transformation \(Y = 1/W\), where \(W = -\sum \log X_i\), I should say.

OpenStudy (anonymous):

The question is very unclear.

OpenStudy (kirbykirby):

How is it unclear?

OpenStudy (kirbykirby):

@wio

OpenStudy (kirbykirby):

iid: independent and identically distributed; pdf: probability density function; MLE: maximum likelihood estimator. Is this better?
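A faster route than a full change of variables is to integrate \(1/w\) directly against the gamma density and recognize another gamma integral; a sketch, writing \(W\sim\text{GAM}(n,1/\theta)\) in its rate-\(\theta\) form (valid for \(n>1\)):

\[
E\!\left[W^{-1}\right]
=\int_0^\infty \frac{1}{w}\cdot\frac{\theta^{n}\,w^{n-1}e^{-\theta w}}{\Gamma(n)}\,dw
=\frac{\theta^{n}}{\Gamma(n)}\int_0^\infty w^{n-2}e^{-\theta w}\,dw
=\frac{\theta^{n}}{\Gamma(n)}\cdot\frac{\Gamma(n-1)}{\theta^{n-1}}
=\frac{\theta}{n-1},
\]

so \(E[\hat{\theta}]=nE[W^{-1}]=\frac{n\theta}{n-1}\); the MLE is biased upward by a factor of \(n/(n-1)\), which vanishes as \(n\to\infty\).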
