Least count of a stop watch is 0.2 sec. The time of 20 oscillations of a pendulum is measured to be 25 sec. The max % error in measurement is:
As a rule you can interpolate one more digit than the markings on an analog instrument, so if the stopwatch has a dial and hands, you would report the time to two decimal places and expect the last place to be good to about +/- 0.05 s, for a percent error of 0.05/25.00 = 0.2%.

If the stopwatch is digital, then as a rule you are uncertain by +/- 1 in the last digit. If it reads to one decimal place, your uncertainty would be +/- 0.1 s, for a percent error of 0.1/25.0 = 0.4%. However, there is an odd feature here: the least count is 0.2 s, so the display presumably advances by 0.2 s per tick. Being off by +/- 1 tick then means +/- 0.2 s, for a percent error of 0.2/25.0 = 0.8%.

Unfortunately, without more information about the measuring instrument, I can't tell which is the best answer.
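The three cases above all use the same formula, percent error = (absolute uncertainty / measured value) x 100. A minimal sketch of that arithmetic (the helper name `percent_error` is my own, not from the question):

```python
def percent_error(uncertainty, measured):
    """Percent error implied by an absolute uncertainty in a measurement."""
    return 100.0 * uncertainty / measured

t = 25.0  # measured time of 20 oscillations, in seconds

# The three readings of the instrument discussed above:
analog  = percent_error(0.05, t)  # analog dial, interpolated last digit -> 0.2%
digital = percent_error(0.1, t)   # digital, +/- 1 in the last displayed digit -> 0.4%
ticks   = percent_error(0.2, t)   # digital advancing in 0.2 s ticks -> 0.8%

print(analog, digital, ticks)
```

Note that the 0.8% case is the one that follows directly from the stated least count of 0.2 s, which is why it is the conventional textbook answer.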