Since increasing resistance decreases the current, why is more heat produced? Since the current flow is smaller, shouldn't there be less heat? Why more heat instead? This is the case of a heating element, the nichrome wire.
Sorry, I can't fully understand your question. If you keep the voltage constant and increase the resistance, the power will drop (P = U^2/R).
I think you may have to reword your question. In an ideal wire the current does not change until it flows through the resistor. If we increase the resistance, we only decrease the current flowing through the circuit. Resistors convert electrical energy into heat, so a big resistor can dissipate a lot of energy as heat even while we see only a small current flow.
It depends on what you're holding constant: if you're holding the voltage constant then you're right, since P = V^2/R, but if you're holding the current constant then more heat is produced, because P = I^2 R.
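To make that concrete, here's a quick numeric sketch of the two cases. The 120 V supply, the 10 A source, and the resistance values are all made-up numbers for illustration:

```python
# Compare doubling R under constant voltage vs. constant current.
# All values here (120 V, 10 A, 10/20 ohm) are hypothetical.
for R in (10.0, 20.0):  # ohms: doubling the resistance
    V = 120.0           # volts, held constant in case 1
    I = 10.0            # amps, held constant in case 2
    print(f"R = {R:>4} ohm | constant V: P = V^2/R = {V**2 / R:6.0f} W"
          f" | constant I: P = I^2*R = {I**2 * R:6.0f} W")

# Doubling R halves the power at constant voltage (1440 W -> 720 W)
# but doubles it at constant current (1000 W -> 2000 W).
```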
If the voltage is constant, then when R decreases, I increases. The heat produced is Q = I^2 R t = (U^2/R) t = UIt. Assume t (time) is constant. Don't use I^2 R t, because it contains two changing variables. Since the voltage is constant, use (U^2/R) t: as R decreases, Q increases. Or use UIt: as I increases, Q increases.
Q means heat
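Here's a small worked example of that constant-voltage case. The 120 V supply, the resistances, and the 60 s run time are assumed values, not from the original question:

```python
# Worked example of Q = I^2*R*t = (U^2/R)*t = U*I*t at constant voltage.
U = 120.0   # volts, held constant (hypothetical supply)
t = 60.0    # seconds, held constant (hypothetical run time)

for R in (20.0, 10.0):   # ohms: R decreases
    I = U / R            # current rises as R falls (Ohm's law)
    Q = U * I * t        # joules of heat; equals (U**2 / R) * t
    print(f"R = {R:>4} ohm -> I = {I:3.0f} A, Q = {Q:7.0f} J")

# R falling from 20 to 10 ohms doubles I (6 A -> 12 A)
# and doubles the heat Q (43200 J -> 86400 J).
```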