Let X be a random variable. By expanding the expression \(E[(X-E(X))^2]\), show that \(E(X^2) \ge (E(X))^2\) ... How can I end up with an inequality from a simple expansion?
paste the entire problem
That is it I am afraid. Hoping it is not a typo in the textbook. It says it is copyrighted to the IB as it's from a past HL Stats Option paper.
ok, we use the property that the variance (Var) of a random variable is always non-negative. Let \(E(X)=\mu\). Then \(Var(X)=E[(X-E(X))^2]=E[X^2-2\mu X+\mu^2]\). Now, the mean is a constant, and the expectation of a constant is that constant, so \(E[\mu^2]=\mu^2\) and \(E[\mu X]=\mu E[X]\). Hence \(Var(X)=E[X^2]-2\mu E[X]+\mu^2=E[X^2]-2\mu^2+\mu^2=E[X^2]-\mu^2\). Since \(Var(X)\ge 0\), we get \(E[X^2]-\mu^2 \ge 0 \implies E[X^2]\ge (E[X])^2\).
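If it helps to see the identity in action, here's a quick numerical sanity check (my own sketch, not from the original answer): for any sample, the sample analogue of \(E[X^2]-(E[X])^2\) is the variance, which can never be negative.

```python
import random
import statistics

# Draw a sample from an arbitrary distribution (parameters chosen arbitrarily)
random.seed(0)
xs = [random.gauss(5, 2) for _ in range(10_000)]

mean = statistics.fmean(xs)                    # sample analogue of E[X]
mean_sq = statistics.fmean(x * x for x in xs)  # sample analogue of E[X^2]

# E[X^2] - (E[X])^2 equals the variance, hence is non-negative
var = mean_sq - mean ** 2
print(var >= 0)  # True
```

The same check passes for any sample you feed in, since the difference is literally the variance computed via the expanded form.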
Thank you very much indeed. A nice little solution to a foxy little question!
welcome ^_^