how to get a standard deviation from a set of numbers, simple explanation please.
hmmm, variance is the standard deviation squared
ya, square root of variance
find the mean, subtract the mean from each point and square the results, add up all the squared results ... then divide by how many there are if it's a population, or by one less if it's just a sample
then sqrt that result i think
var \(= \frac{\sum(x-\bar x)^2}{N}\) for a population, or \(\frac{\sum(x-\bar x)^2}{n-1}\) for a sample, right?
sqrt that to get sd
i don't think it's over n
i wouldn't know, i can't tell if you've got a population or a sample
and standard dev is the square root of the variance
the n-1 for a sample has to do with the degrees of freedom i think
that's the one, sd = sqrt(var)
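a quick sketch of that in Python, with made-up example numbers — the standard library's `statistics` module has both versions, so you can see the population (divide by N) and sample (divide by n-1) results side by side:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up example numbers

# population standard deviation: variance uses N in the denominator
pop_sd = statistics.pstdev(data)

# sample standard deviation: variance uses n - 1 (one degree of
# freedom is lost because the mean was estimated from the same data)
samp_sd = statistics.stdev(data)

print(pop_sd)   # 2.0
print(samp_sd)  # ~2.1381, slightly bigger since we divide by 7 not 8
```

the sample value is always a bit larger, because dividing by n-1 instead of n inflates the variance slightly.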
so, take the difference of each value from the mean and square it
then....
add up all the squares of the differences from the mean
divide by n or n-1 depending on whether it's a population or a sample
then sqrt it
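those steps, written out by hand in Python for the population case (same made-up numbers as above):

```python
import math

data = [2, 4, 4, 4, 5, 5, 7, 9]  # made-up example numbers

mean = sum(data) / len(data)                     # step 1: the mean
squared_diffs = [(x - mean) ** 2 for x in data]  # step 2: square each difference
variance = sum(squared_diffs) / len(data)        # step 3: divide by N (population)
sd = math.sqrt(variance)                         # step 4: square root

print(sd)  # 2.0
```

for a sample you'd just swap `len(data)` for `len(data) - 1` in step 3.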
ahhh, ok
the sqrt at the end is supposed to counteract the squaring at the start.
ok, i can do calc all day, can't stand stats
stats was fun last semester