"Suppose that you measure the resistance of a resistor sing a voltmeter, a current meter, and a battery. Suppose that the voltmeter reads 0.2% high and the current meter reads 0.3% low. What will be the percent error in your measurement of resistance?" I'm confused on how to solve this problem. If someone could walk me through this, I'd really appreciate it.
just a guess/start: but Ohm's law to begin with? V = IR, so R = V/I
Okay, that's a good place to start . . . I understand Ohm's law, it's just figuring out the percent error that's really confusing me.
i guess (V * 1.002)/(I * 0.997), since reading 0.2% high means multiplying the true voltage by 1.002, and reading 0.3% low means multiplying the true current by 0.997. Then 1.002/0.997 ≈ 1.0050, so your measured R comes out about 0.5% too high. (Handy shortcut: for small errors, the percent error of a quotient is roughly the numerator's error minus the denominator's, so +0.2% − (−0.3%) = +0.5%.)
does that make sense?
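if you want to sanity-check it numerically, here's a quick Python sketch. the 10 V and 0.1 A "true" values are just numbers I made up; any values give the same percent error since only the ratio of the meter errors matters:

    # Sanity check of the error propagation. The "true" values below are
    # arbitrary (10 V and 0.1 A, so R_true = 100 ohms); only the ratio of
    # the meter errors affects the percent error in R.
    V_true = 10.0             # hypothetical true voltage, volts
    I_true = 0.1              # hypothetical true current, amps
    R_true = V_true / I_true  # true resistance via Ohm's law

    V_meas = V_true * 1.002   # voltmeter reads 0.2% high
    I_meas = I_true * 0.997   # current meter reads 0.3% low

    R_meas = V_meas / I_meas  # measured resistance: R = V / I
    pct_error = (R_meas - R_true) / R_true * 100
    print(f"measured R = {R_meas:.4f} ohms, error = {pct_error:.4f}%")
    # prints: measured R = 100.5015 ohms, error = 0.5015%

so the exact answer is about +0.50%, which matches the add-the-percent-errors shortcut.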
I think it does, actually! I'm going to try it. Thank you so much :D
all good, hope it works man =)