Will give medal! Need help with some stat problem: We have a random sample, and Xi, i=1,...,n are Gamma(α,β) random variables with known α, then a) find the posterior distribution of λ=1/β given X=(X1,...,Xn) if the improper prior distribution of λ is c/λ, where c is a constant. b) Find the Bayes estimator of β. c) Compare the Bayes estimator of β in b) with the UMVUE (uniformly minimum variance unbiased estimator) of β. To be clear, we use the following gamma P.D.F.: \[\frac{x^{\alpha-1}e^{-x/\beta}}{\Gamma(\alpha)\beta^{\alpha}} \]
For the prior, since \(c\) is a constant, \(\pi(\lambda)\propto\frac{1}{\lambda}=\lambda^{-1}\). Also, since \(\lambda=\frac{1}{\beta}\), we have \(\beta=\frac{1}{\lambda}\). For a) i) The likelihood is:\[ \large\begin{align} L(\lambda)&=\prod_{i=1}^n\frac{1}{\Gamma(\alpha)\left(\frac{1}{\lambda}\right)^\alpha}x_i^{\alpha-1}e^{-x_i/(1/\lambda)}\\ &=\left(\Gamma(\alpha)\right)^{-n}\lambda^{n\alpha}\left(\prod_{i=1}^n x_i\right)^{\alpha-1}e^{-\lambda\sum\limits_{i=1}^nx_i}\end{align}\] ii) The posterior then is: \[\large\begin{align} \pi(\lambda|x)&\propto L(\lambda)\pi(\lambda)\\ &=\left(\Gamma(\alpha)\right)^{-n}\lambda^{n\alpha}\left(\prod_{i=1}^n x_i \right)^{\alpha-1}e^{-\lambda\sum\limits_{i=1}^nx_i}\cdot\lambda^{-1}\\ &\propto \lambda^{n\alpha-1}e^{-\lambda\sum\limits_{i=1}^nx_i}\\ &= \lambda^{n\alpha-1}e^{-\lambda\big/\frac{1}{\sum_{i=1}^nx_i}}, \text{ the kernel of a Gamma distribution}\end{align}\]Thus, \[\lambda|x\sim\text{GAM}\left(n\alpha,\frac{1}{\sum_{i=1}^nx_i}\right)\] But we are interested in estimating \(\beta\), so we need the distribution of \(\frac{1}{\lambda}\mid x\). iii) The distribution of \(\frac{1}{\lambda}\mid x\): I'll denote \(\Lambda=\lambda\mid x\) and \(B=\frac{1}{\lambda}\mid x\), and their corresponding realizations simply as \(\lambda\) and \(\beta\), respectively. 
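As a quick sanity check on part a) (not part of the formal solution), here is a small numerical sketch; the values of \(n\), \(\alpha\), and \(\sum x_i\) are made up for illustration. It integrates the unnormalized posterior kernel \(\lambda^{n\alpha-1}e^{-\lambda\sum x_i}\) on a grid and confirms that its mean matches \(n\alpha\big/\sum x_i\), the mean of a \(\text{GAM}\left(n\alpha,\frac{1}{\sum x_i}\right)\) distribution:

```python
import math

# Hypothetical values, purely for illustration: known shape alpha, sample size n,
# and an assumed observed value of sum(x_i).
alpha, n = 2.0, 5
s = 12.3  # assumed sum of the observed x_i

# Unnormalized posterior kernel of lambda: lambda^(n*alpha - 1) * exp(-lambda * s)
def kernel(lam):
    return lam ** (n * alpha - 1) * math.exp(-lam * s)

# Crude Riemann-sum integration on a grid; the kernel is negligible past lambda = 20
dx = 1e-4
grid = [i * dx for i in range(1, 200000)]
Z = sum(kernel(lam) for lam in grid) * dx             # normalizing constant
mean = sum(lam * kernel(lam) for lam in grid) * dx / Z  # posterior mean of lambda

# A GAM(n*alpha, 1/s) distribution (shape n*alpha, scale 1/s) has mean n*alpha/s
print(mean, n * alpha / s)
```

The two printed numbers agree, supporting the identification of the posterior above.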
Then, \[ B=h(\Lambda)=\frac{1}{\Lambda}\implies h^{-1}\left(B\right)=\frac{1}{B}\] Then, using the 1-1 transformation theorem, \[\large \begin{align} f_B(\beta)&=f_{\Lambda}\left(h^{-1}(\beta) \right)\left|\frac{d}{d\beta}h^{-1}(\beta) \right| \\ &=\frac{1}{\Gamma(n\alpha)\left(\frac{1}{\sum_{i=1}^nx_i}\right)^{n\alpha}}\left(\frac{1}{\beta}\right)^{n\alpha-1}\exp\left\{ -\frac{\left( \frac{1}{\beta}\right)}{\frac{1}{\sum_{i=1}^nx_i}}\right\}\left|-\frac{1}{\beta^2}\right|\\ &=\frac{1}{\Gamma(n\alpha)\left(\frac{1}{\sum_{i=1}^nx_i}\right)^{n\alpha}}\left(\frac{1}{\beta}\right)^{n\alpha+1}\exp\left\{ -\frac{\sum_{i=1}^nx_i}{\beta}\right\}\\ &=\frac{1}{\Gamma(n\alpha)\left(\frac{1}{\sum_{i=1}^nx_i}\right)^{n\alpha}}\beta^{-n\alpha-1}\exp\left\{ -1\Bigg/\beta\left( \frac{1}{\sum_{i=1}^nx_i}\right) \right\}\end{align}\]This is the distribution of our \(B\) random variable, and hence the posterior of interest: an inverse-gamma density with shape \(n\alpha\) and scale \(\sum_{i=1}^nx_i\). ------------------------------ b) Now, for the support of the new distribution, observe that \(\lambda>0\implies\frac{1}{\beta}>0\implies\beta>0\). The Bayes estimate is the mean of the posterior, so we find \(E(B)=E(\beta\mid x)=\) \[\large \begin{align}&=\int_{0}^{\infty} \beta\cdot\frac{\beta^{-n\alpha-1}}{\Gamma(n\alpha)\left(\frac{1}{\sum_{i=1}^nx_i}\right)^{n\alpha}}\exp\left\{ -1\Bigg/\beta\left( \frac{1}{\sum_{i=1}^nx_i}\right) \right\}\,d\beta\\ &=\frac{1}{\Gamma(n\alpha)\left(\frac{1}{\sum_{i=1}^nx_i}\right)^{n\alpha}} \int_0^{\infty} \beta^{-n\alpha}\exp\left\{ -1\Bigg/\beta\left( \frac{1}{\sum_{i=1}^nx_i}\right) \right\}\,d\beta\end{align}\]Let \[u=\frac{1}{\beta\left(1\Big/\sum_{i=1}^nx_i \right)}\implies \beta=\frac{1}{u\left(1\Big/\sum_{i=1}^nx_i \right)}\\ -du=\frac{\beta^{-2}}{1\big/\sum_{i=1}^nx_i}d\beta \\ \, \\ \text{change of bounds: at }\beta=0, u=\infty \text{ and at }\beta=\infty, u=0\] So, back to \(E(B)\) \[ \large \begin{align} &=\frac{-1}{\Gamma(n\alpha)\left(\frac{1}{\sum_{i=1}^nx_i}\right)^{n\alpha-1}} \int_{\infty}^{0} 
\left( \frac{1}{u\left(1\Big/\sum_{i=1}^nx_i \right)}\right)^{-n\alpha+2}e^{-u} \,du\\ &=\frac{1}{\Gamma(n\alpha)\left(\frac{1}{\sum_{i=1}^nx_i}\right)^{n\alpha-1}\left(\frac{1}{\sum_{i=1}^nx_i}\right)^{-n\alpha+2}} \underbrace{\int_{0}^{\infty} u^{n\alpha-2}e^{-u} \, du}_{\Gamma(n\alpha-1 )}\\ \, \\ &=\frac{1}{\Gamma(n\alpha)\left(\frac{1}{\sum_{i=1}^nx_i}\right)}\Gamma(n\alpha-1)\\ \, \\ &=\sum_{i=1}^nx_i\frac{\Gamma(n\alpha-1)}{(n\alpha-1)\Gamma(n\alpha-1)}\\ \, \\ &=\frac{\sum\limits_{i=1}^nx_i}{n\alpha-1},\text{this is the Bayes estimate for }\beta \end{align}\]Hence, the Bayes estimator is simply \[ \frac{\sum\limits_{i=1}^nX_i}{n\alpha-1}\] (note that \(n\alpha>1\) is required for the integral, and hence the posterior mean, to exist). --------------------------------------------- c) To find the UMVUE, you can show that \(X\) belongs to the regular exponential family, which requires 3 steps: i) Write the pdf in the exponential form \[f(x)=c(\theta)h(x)\exp\left\{\sum_{j=1}^k \eta_j(\theta)T_j(x) \right\} \]So, \[ f(x)=\frac{1}{\Gamma(\alpha)\beta^{\alpha}}x^{\alpha-1}\exp\left\{-\frac{x}{\beta} \right\}\\ \implies c(\theta)=\frac{1}{\Gamma(\alpha)\beta^{\alpha}}, \,\, h(x)=x^{\alpha-1},\,\, \eta(\theta)=\frac{-1}{\beta}, \,\, T(x)=x\] ii) We can write the exponential form in the canonical exponential form by reparameterizing the pdf, defining \( \eta(\theta)=\eta\). Clearly, \(\eta\) and \(T(X)\) do not satisfy any linear constraints since they are scalars. iii) We check that the image of the parameter space contains an open subset of \(\mathbb{R}^k\), where \(k=\dim(\eta)=1\): \[\eta= \frac{-1}{\beta}, \beta \in \mathbb{R}^+\implies \eta \in \mathbb{R}^-\]So the image of the parameter space is \(\mathbb{R}^-\), which is an open subset of \(\mathbb{R}^1=\mathbb{R}\). Thus, we conclude \(X\) belongs to the regular exponential family, and hence the natural sufficient statistic \( T(X)=\sum\limits_{i=1}^nT(X_i)=\sum\limits_{i=1}^nX_i\) is a complete sufficient statistic. So, to find the UMVUE of \(\beta\), we find \(g(T)\) such that \( E[g(T)]=\beta\) (from the Lehmann-Scheffé Theorem). 
\[ X_i\sim\text{GAM}(\alpha,\beta) \implies \sum_{i=1}^nX_i\sim\text{GAM}(n\alpha,\beta)\\ \text{so let } g(T)=\frac{\sum_{i=1}^nX_i}{n\alpha}=\frac{T}{n\alpha} \]So, \[\begin{align} E[g(T)]&=E\left(\frac{\sum_{i=1}^nX_i}{n\alpha} \right) \\ &=\frac{1}{n\alpha}E\left( \sum_{i=1}^nX_i\right) \\ &=\frac{n\alpha\beta}{n\alpha}=\beta \end{align} \] \[\boxed{\text{Thus, the UMVUE is }\large \frac{\sum_{i=1}^nX_i}{n\alpha} \\ \text{while the Bayes estimator is }\large \frac{\sum_{i=1}^nX_i}{n\alpha-1}.}\] The two differ only in the denominator (\(n\alpha\) versus \(n\alpha-1\)): since \(E\left[\frac{T}{n\alpha-1}\right]=\frac{n\alpha}{n\alpha-1}\beta\), the Bayes estimator is biased slightly upward, but the two estimators agree in the limit as \(n\to\infty\).
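To see the comparison in c) concretely, here is a quick Monte Carlo sketch (the parameter values \(\alpha=2\), \(\beta=3\), \(n=10\) are made up for illustration, not part of the formal answer). It checks that \(T/(n\alpha)\) is unbiased for \(\beta\), while the Bayes estimator \(T/(n\alpha-1)\) overshoots by the factor \(n\alpha/(n\alpha-1)\):

```python
import random
import statistics

random.seed(0)
alpha, beta, n = 2.0, 3.0, 10  # assumed true shape, scale, and sample size
reps = 20000                   # number of simulated samples

umvue, bayes = [], []
for _ in range(reps):
    # T = sum of n iid Gamma(alpha, beta) draws (gammavariate uses shape, scale)
    t = sum(random.gammavariate(alpha, beta) for _ in range(n))
    umvue.append(t / (n * alpha))        # UMVUE: T / (n*alpha)
    bayes.append(t / (n * alpha - 1))    # Bayes estimator: T / (n*alpha - 1)

print(statistics.mean(umvue))  # approximately beta = 3.0
print(statistics.mean(bayes))  # approximately beta * n*alpha/(n*alpha - 1)
```

With \(n\alpha=20\) the upward bias of the Bayes estimator is only about 5%, and it shrinks to zero as \(n\) grows.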
Wow!
I'm going to agree with @douglaswinslowcooper and say "wow"! Very detailed answer!! Thank you sooooo much @kirbykirby . I would give a hundred medals if I could n_n