Standard error: a) Is always the standard deviation divided by the square root of n. b) Is a measure of the variability of a sample statistic. c) Increases with bigger sample sizes. d) All of the above.
standard error is defined as sigma/sqrt(n): the bigger the denominator, the smaller the error
s, or s^2, is a measure of the sample variability
s/sqrt(n) is something you see a lot as well; that's the estimated version, where the sample standard deviation s stands in for sigma
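Here's a minimal Python sketch of that point, assuming NumPy and a made-up sample from a population with sigma = 10 (the numbers and names are just for illustration): the estimated standard error s/sqrt(n) shrinks as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample from a population with mean 50 and sigma = 10
sample = rng.normal(loc=50, scale=10, size=25)

n = sample.size
s = sample.std(ddof=1)      # sample standard deviation
sem = s / np.sqrt(n)        # estimated standard error of the mean, s/sqrt(n)
print(f"n={n}, s={s:.2f}, SEM={sem:.2f}")

# Bigger n -> smaller standard error (the "bigger the denominator" point)
bigger = rng.normal(loc=50, scale=10, size=2500)
print(f"n={bigger.size}, SEM={bigger.std(ddof=1) / np.sqrt(bigger.size):.2f}")
```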
so what do you suggest?
since it can't be all of the above, go with the definition of standard error in the text
b?
there does seem to be some ambiguity... i was leaning towards a, though
standard error of the mean and standard error of the estimate seem to come up when you define standard error
and? the rest are wrong, right? i have to choose only one!
i don't know what the choice format is; can you choose more than one answer? "Standard error is a statistical term that measures the accuracy with which a sample represents a population. In statistics, a sample mean deviates from the actual mean of a population; this deviation is the standard error."
no...only one choice..
http://stattrek.com/estimation/standard-error.aspx if it's referring to standard error (of whatever)... then "a" might have some flaw to it
The standard error is the standard deviation of the sampling distribution of a statistic.
i'd rather use a book :)
the variability in a sample statistic
its value is calculated differently based on which statistic you're assessing
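A quick simulation sketch of that definition, again assuming NumPy and a made-up population with sigma = 10: the standard deviation of the sampling distribution of the mean matches sigma/sqrt(n), which is why option "a" only holds for the mean, not for every statistic.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma, n, reps = 10.0, 25, 100_000

# Simulate the sampling distribution of the mean: draw many samples of size n
# and record each sample's mean.
sample_means = rng.normal(loc=50, scale=sigma, size=(reps, n)).mean(axis=1)

# The SD of that distribution is the standard error of the mean,
# and it should come out close to sigma/sqrt(n).
print("SD of sample means:", sample_means.std(ddof=1))  # approximately 2.0
print("sigma/sqrt(n):     ", sigma / np.sqrt(n))        # exactly 2.0
```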
ok.. we won't waste more time.. choose one like tossing a coin ;p
i choose B :)
here you are ;p let's go on to the other one!
what, are you just trying to cheat off of someone else's answer? i'm pretty sure we covered the options pretty well