Can someone please tell me what is meant by this: "The development of clocks based on atomic oscillations allowed measures of timing with accuracy on the order of 1 part in 10^14, corresponding to errors of less than one microsecond (one millionth of a second) per year"? First of all, I do not understand what accuracy refers to here, and second, what is meant by an error of one microsecond per year? Thank you.
They are discussing how much the atomic clock can be expected to run fast or slow over the course of one year. A perfect clock keeps perfect time, so after one year its error (or drift) would be 0 microseconds. An accuracy of 1 part in 10^14 means the clock may gain or lose about 1 second for every 10^14 seconds that pass; since a year is only about 3 x 10^7 seconds long, the accumulated drift after a year is a fraction of a microsecond. An error that small is negligible for everyday purposes, but it can be very important, for example, when calculating across astronomical distances or dealing with velocities near the speed of light.
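Just to check the arithmetic, here is a quick back-of-the-envelope sketch in Python (the year length of 365.25 days is my own assumption, not something stated in the textbook):

    seconds_per_year = 365.25 * 24 * 3600     # about 3.16e7 seconds in a year
    fractional_accuracy = 1e-14               # accuracy of 1 part in 10^14
    drift = seconds_per_year * fractional_accuracy
    print(drift)                              # about 3.2e-7 s, i.e. roughly 0.3 microseconds

So the stated accuracy does indeed work out to an error of less than one microsecond per year.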