You plug a string of 100 lights in series into a 120 V power outlet, and each light has a resistance of 3.00 Ω. If each light has a power rating of 0.50 W, what will happen? (a) The string will remain lit. (b) All the lights will go out. (c) Only one light will go out. (d) You cannot predict what will happen.
@ParthKohli sooo sorry to bug you, but could you please help me a little bit more?
thank you for coming... I know you are busy. I just have a pile of this stuff to get through in a couple of hours and I have no idea how to do it :(
\[\dfrac{V^2}{R} = P_{\text{rated}}\]The resistance of each light is \(3\ \Omega\) and the rating is \(0.50\ \text{W}\), so each bulb is rated\[\dfrac{V^2}{3} = 0.50\]\[V = \sqrt{1.5} \approx 1.22\ \text{V}\]
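A quick sketch of that rating calculation, assuming the values given in the problem (0.50 W and 3.00 Ω per bulb):

```python
import math

P_rated = 0.50  # W, power rating of each bulb (from the problem)
R = 3.00        # ohms, resistance of each bulb (from the problem)

# P = V^2 / R  =>  V = sqrt(P * R), the voltage each bulb is rated for
V_rated = math.sqrt(P_rated * R)
print(V_rated)  # about 1.22 V per bulb
```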
Use Ohm's law, my child
\[\huge V=IR\]
Do I multiply that by 100? And since that gives 122 V... does that make all the lights go out? Or am I totally wrong?
We have voltage 120 V, right? So 120 = (3 × 100) · I, which gives a current of I = 0.4 A. Here we use the so-called voltage-divider rule: the voltage drop across each lamp is 120/100 = 1.2 V. Then calculate the dissipated power with the formula P = RI² and check it against the rating.
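The steps above can be sketched out like this, using the problem's numbers (120 V supply, 100 bulbs of 3.00 Ω each, 0.50 W rating):

```python
V_supply = 120.0   # V, outlet voltage
R_bulb = 3.00      # ohms, resistance of each bulb
n = 100            # number of bulbs in series
P_rated = 0.50     # W, power rating of each bulb

R_total = n * R_bulb          # 300 ohms total series resistance
I = V_supply / R_total        # 0.4 A, same current through every bulb
V_bulb = V_supply / n         # 1.2 V across each bulb (voltage divider)
P_bulb = I**2 * R_bulb        # 0.48 W dissipated in each bulb

print(I, V_bulb, P_bulb)
print(P_bulb < P_rated)  # True: each bulb runs below its rating
```

Since 0.48 W is below the 0.50 W rating, the bulbs operate safely and the string stays lit.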
The answer with the power equation is 0.48 W... that's less than the power rating, so all the lights are fine?
sorry if these are dumb q's