OpenStudy (anonymous):

It is claimed that two cesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard cesium clock in measuring a time-interval of 1 s?

OpenStudy (vincent-lyon.fr):

The ratio is dimensionless, but its numerator and denominator are expressed in different units. Convert it to 'seconds per second' and you will get your answer.

OpenStudy (ganpat):

In 100 years there is 1 leap year in every 4 years, so 24 leap years + 76 ordinary years. 1 year = 365 days and 1 leap year = 366 days, so 24(366) + 76(365) = 36524 days. Now calculating the number of seconds: 36524 × 24 × 60 = 52,594,560 seconds in 100 years. So 52,594,560 seconds → 0.02 s error, and 1 second → x error. Thus x = 0.02 / 52,594,560 = 3.802 × 10⁻¹⁰. Error per second for the cesium clock = 3.802 × 10⁻¹⁰. Just a try...

OpenStudy (ganpat):

the error would be ±3.802 × 10⁻¹⁰..

OpenStudy (ganpat):

is that right ??

OpenStudy (vincent-lyon.fr):

@Ganpat You have mistaken minutes and seconds in your calculation.

OpenStudy (ganpat):

ya, I guess.. I need to multiply by another 60, but the method remains the same. Anyways, thanks @Vincent-Lyon.Fr ...
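For reference, the corrected arithmetic (restoring the factor of 60 that converts minutes to seconds, as pointed out above) can be sketched in Python. The 24-leap-year count per century follows Ganpat's post; the exact figure would vary slightly with which century is chosen:

```python
# Fractional accuracy of a cesium clock that drifts 0.02 s over 100 years.
# Assumes 24 leap years per century, as in the post above.

leap_years = 24
ordinary_years = 76
days = leap_years * 366 + ordinary_years * 365   # 36524 days in 100 years

seconds_per_day = 24 * 60 * 60                   # 86400 s -- includes the missing factor of 60
total_seconds = days * seconds_per_day           # about 3.156e9 s in 100 years

drift = 0.02                                     # seconds of disagreement between the clocks
fractional_error = drift / total_seconds         # about 6.3e-12

print(days, total_seconds, fractional_error)
```

This gives a fractional error of roughly 6.3 × 10⁻¹², i.e. the clock is accurate to about 1 part in 10¹¹ when measuring a 1 s interval.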
