A set of data has a mean of 12 and a standard deviation of 3. A data point in the set has a z-score of 1.3. What does a z-score of 1.3 mean?
- The data point is 1.3 standard deviations away from 3.
- The data point is 1.3 standard deviations away from 12.
- The data point is 3 standard deviations away from 1.3.
- The data point is 3 standard deviations away from 12.
The second option. The data point is 1.3 standard deviations away from 12.
How?
The formula for a z-score is z = (point - mean) / standard deviation. The z-score measures how far a point is from the mean, in units of standard deviations. Say we pick a point with a value of 15. Plugging into the formula gives (15 - 12) / 3 = 1, so the z-score is 1, which makes sense because 15 is exactly one standard deviation above the mean. By the same logic, a z-score of 1.3 means the point is 1.3 standard deviations away from the mean. Since the mean is 12, a point with a z-score of 1.3 is 1.3 standard deviations away from 12. Does that make sense?
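If it helps to see the arithmetic spelled out, here is a minimal Python sketch of the same calculation; the function name z_score and the sample values are just for illustration and are not part of the original question.

import math

def z_score(point, mean, std_dev):
    # How many standard deviations `point` lies from `mean`.
    return (point - mean) / std_dev

# Worked example from the explanation above: mean = 12, standard deviation = 3.
print(z_score(15, 12, 3))    # 1.0  -> 15 is exactly one standard deviation above 12
print(z_score(15.9, 12, 3))  # 1.3  -> 15.9 = 12 + 1.3 * 3, so it is 1.3 standard deviations above the mean
assert math.isclose(z_score(15.9, 12, 3), 1.3)

Running it prints 1.0 and then approximately 1.3, matching the hand calculation above.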
Yes it does. Thank you so much