High-level statistics (let's go with that)
The statistics that I got for part a) are as follows:\[\left(\sum_{i=1}^{n}\ln(x_i),\ \sum_{i=1}^{n}x_i\right)\]and I showed (hopefully I'm right) that this distribution can be expressed as a member of the exponential family. I then found the MLE to be\[\theta= r \sum_{i=1}^{n}x_i\]and showed that this is indeed a maximum point by taking the second derivative of the expression leading up to it. If my work to this point is correct, I'm having trouble finishing from part d) on.
I'm curious as to how you got two sufficient statistics... Did you use different methods? As for me, I'm getting one that's slightly different. Using the likelihood function and the factorization theorem, I have \[\large\begin{align*}L(\theta)&=\prod_{i=1}^n\frac{r}{\theta}{x_i}^{r-1}\exp\left(-\frac{1}{\theta}{x_i}^r\right)\\ &=\frac{r^n}{\theta^n}\left(\prod_{i=1}^n{x_i}\right)^{r-1}\exp\left(-\frac{1}{\theta}\sum_{i=1}^n{x_i}^r\right) \end{align*}\] The factorization theorem tells me that \(\sum {X_i}^r\) is the sufficient statistic.
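As a quick aside, the factorization theorem's claim can be sanity-checked numerically: the likelihood's dependence on \(\theta\) should run through the data only via \(T=\sum x_i^r\), so two different datasets with the same \(T\) must give identical likelihood ratios \(L(\theta_1)/L(\theta_2)\). A minimal Python sketch (the helper name `likelihood` and the sample datasets are just made up for illustration):

```python
import math

def likelihood(data, theta, r):
    """Likelihood under f(x) = (r/theta) * x^(r-1) * exp(-x^r/theta)."""
    L = 1.0
    for x in data:
        L *= (r / theta) * x ** (r - 1) * math.exp(-x ** r / theta)
    return L

r = 2
data1 = [1.0, 2.0]                        # T = 1^2 + 2^2 = 5
data2 = [math.sqrt(2.5), math.sqrt(2.5)]  # T = 2.5 + 2.5 = 5, same T

# Likelihood ratios at two arbitrary theta values agree across datasets,
# because the ratio depends on the data only through T = sum(x_i^r).
ratio1 = likelihood(data1, 1.0, r) / likelihood(data1, 3.0, r)
ratio2 = likelihood(data2, 1.0, r) / likelihood(data2, 3.0, r)
print(ratio1, ratio2)
```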
My first big blunder came from my interpreting the problem as being written as\[\exp\left(-x\cdot\frac{-r}{\theta}\right)\]instead of what you opened my eyes to:\[\exp\left(\frac{x^r}{-\theta}\right)\]Going to retry the problem now, thanks for that haha
Alright cool, so my MLE using the sufficient statistic (which is also complete, since the distribution is a member of the exponential family) comes out as\[\hat\theta=\frac{1}{n}\sum_{i=1}^{n}x_{i}^{r}\]since my log-likelihood function is\[\log L(\theta\mid x)=n\log(r)+(r-1)\sum_{i=1}^{n}\log(x_i)-n\log(\theta)-\frac{1}{\theta}\sum_{i=1}^{n}x_{i}^{r}\]and setting the derivative with respect to \(\theta\) to zero gives\[\frac{-n}{\theta}+\frac{1}{\theta^2}\sum_{i=1}^{n}x_{i}^{r}=0 \;\Rightarrow\; \hat\theta=\frac{1}{n}\sum_{i=1}^{n}x_{i}^{r}\]Performing the second derivative test shows that this is indeed a maximum as well. For part d), it's clear this is a function of the sufficient statistic, because the only difference is the factor \(1/n\), which is a one-to-one transformation on my interval. Hopefully I'm doing things right now.
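If you want to convince yourself the estimator behaves, one option is a small simulation: since the CDF here is \(F(x)=1-\exp(-x^r/\theta)\), you can sample by inverse transform, \(x=(-\theta\ln(1-u))^{1/r}\) with \(u\sim\text{Uniform}(0,1)\), and check that \(\hat\theta=\frac1n\sum x_i^r\) lands near the true \(\theta\). A rough Python sketch (the parameter values and sample size are arbitrary):

```python
import math
import random

random.seed(0)
theta, r, n = 2.0, 3.0, 100_000  # true parameter values (arbitrary choice)

# Inverse-transform sampling from f(x) = (r/theta) x^(r-1) exp(-x^r/theta)
sample = [(-theta * math.log(1.0 - random.random())) ** (1.0 / r)
          for _ in range(n)]

# MLE: theta_hat = (1/n) * sum(x_i^r)
theta_hat = sum(x ** r for x in sample) / n
print(theta_hat)  # should be close to the true theta = 2.0
```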
If you are still hovering around :) I was wondering if this is correct: to find\[E[X_{i}^{r}]\]I have to take the integral, which would look like\[\int\limits_{0}^{x}y^rf(y)\,dy\]where \(f(y)\) is the PDF I am given at the beginning of the problem.
Integration should be done over the support, so \[E(X^r)=\int_0^\infty x^r\cdot f(x)~dx\]
Ah, my mistake. But on the same note, that does not make for a very nice integral:\[\frac{ r }{ \theta } \int\limits_{0}^{\infty}x^{2r-1}\exp(-x^r/\theta)\,dx\]Do you happen to know of any tricks that would make this even a bit simpler to integrate?
I believe there's a power reduction formula for this type of integrand. You can derive it by determining a pattern if you integrate by parts a bunch of times. The nice folks here have it at the ready: http://math.stackexchange.com/questions/46469/how-to-integrate-int-xn-ex-dx
I think also using the gamma function might be good way since you could use the substitution: \[u=\frac{x^r}{\theta} \] since then you get \(du = rx^{r-1}\,dx\) which is already in your integrand, and your bounds of integration will still be from 0 to infinity.
Please correct me if I'm wrong, but shouldn't it be that\[du=\frac{ rx^{r-1} }{ \theta }\,dx\]since we are differentiating with respect to \(x\)?
Oh yes, I forgot the \(\theta\).
So the gamma function states that\[\Gamma(t)=\int\limits_{0}^{\infty}x^{t-1}e^{-x}\,dx\]and my function with that substitution would be\[\frac{ r }{ \theta }\int\limits_{0}^{\infty}x^{2r-1}e^{-u}\,dx\]Does that mean this integral comes out to the gamma function, expressed as\[\frac{ r }{ \theta }\Gamma(2r)\]or is my integrating wrong, and should the \(\theta\) actually have disappeared?
I was thinking of doing it this way: \[ \large \int_0^{\infty}x^r\cdot \frac{r}{\theta}x^{r-1}e^{-\frac{x^r}{\theta}}\,dx\] Using the substitution (and noticing that the \(x^r\) can be written as \(u\theta\)): \[\large \int_0^{\infty}u\theta e^{-u}\ du\] which is equal to \(\theta\, \Gamma(2)\) (since you have \(u^{2-1}\)).
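Since \(\Gamma(2)=1\), that works out to \(E[X^r]=\theta\), which you can cross-check numerically by integrating \(x^r f(x)\) over the support directly (no substitution needed). A hedged Python sketch using a plain trapezoidal rule, with an arbitrary \(\theta\) and \(r\) and the upper limit truncated where the integrand is negligible:

```python
import math

theta, r = 1.5, 2.0  # arbitrary parameter values for the check

def integrand(x):
    # x^r * f(x), with f(x) = (r/theta) * x^(r-1) * exp(-x^r/theta)
    return x ** r * (r / theta) * x ** (r - 1) * math.exp(-x ** r / theta)

# Trapezoidal rule on [0, 20]; the tail beyond 20 is vanishingly small here.
a, b, steps = 0.0, 20.0, 200_000
h = (b - a) / steps
total = 0.5 * (integrand(a) + integrand(b))
total += sum(integrand(a + i * h) for i in range(1, steps))
expectation = total * h
print(expectation)  # should be approximately theta = 1.5
```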