
How does heating of nichrome wire work? Answered

I was thinking about a situation, but right now I've hit a dead end.
OK, picture this:

There's a room with a length of nichrome wire which connects to a power supply.
(The power supply is constant voltage, and the length of nichrome wire will be constant throughout these scenarios)
If the room is at room temperature, and power is applied to the wire, it heats up to x degrees after 1 minute.

Here's what I wanted to know:
If the room was below room temperature, and power was applied, would it still heat up to x degrees after 1 minute?
If the room was at x degrees, what temperature would the wire reach after 1 minute of power (i.e., would it be higher or lower than in the very first scenario)?

Thanks in advance


Jack A Lopez

Best Answer 9 years ago

Heat transfer is driven by temperature difference, and for simple situations of heat transfer by conduction (rather than radiation or convection), the heat power that flows from the hot place to the cooler place is directly proportional to the temperature difference.

P = (Ythermal) * (T2-T1)

Where Ythermal is called thermal conductance.

For electrical heating devices, at equilibrium the heat power that flows out to the surroundings is equal to the electrical power supplied.  So you can take that equation for thermal conductance and rearrange it to say that heat flow (P) is actually causing a temperature difference to appear.

(T2-T1) = Rthermal*P

Where Rthermal = 1/Ythermal, and is called thermal resistance.
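As a quick sketch of what that equation says (the thermal resistance and power values below are invented for illustration, not measured for any real wire), the equilibrium wire temperature is just the ambient temperature plus Rthermal × P, so changing the room temperature shifts the wire temperature by the same amount:

```python
# Equilibrium wire temperature from (T2 - T1) = Rthermal * P.
# The r_thermal and power values are assumed, for illustration only.

def wire_temp(ambient_c, power_w, r_thermal=5.0):
    """Steady-state wire temperature in deg C.
    r_thermal: thermal resistance in K/W (assumed value)."""
    return ambient_c + r_thermal * power_w

# Same power, different room temperatures: the temperature *difference*
# (Rthermal * P = 50 degrees here) stays the same, so the wire
# temperature simply tracks the room temperature.
print(wire_temp(20, 10))  # room at 20 C -> wire at 70 C
print(wire_temp(10, 10))  # cooler room  -> wire at 60 C
print(wire_temp(70, 10))  # room at "x"  -> wire at 120 C
```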

That's what you get for heat transfer by conduction.  The equations for other mechanisms, like radiative or convective heat transfer, are more complicated.

However, because there is always an equilibrium, no matter what the process, I would expect that making the room a few degrees hotter will make the wire a few degrees hotter, and making the room a few degrees cooler will make the wire a few degrees cooler, no matter how much power is flowing into and out of the wire.  I.e. supplying power to the wire effects a temperature difference between the wire and its surroundings.
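To sketch the "after 1 minute" part of the question, a simple lumped first-order model (with an assumed time constant and thermal resistance, invented for illustration) has the wire approaching that equilibrium offset above ambient exponentially, so after a minute it sits the same distance above whatever the room temperature is:

```python
import math

def wire_temp_at(t_s, ambient_c, power_w, r_thermal=5.0, tau_s=20.0):
    """Lumped first-order model: T(t) = T_amb + Rth*P*(1 - exp(-t/tau)).
    r_thermal (K/W) and tau_s (s) are assumed values for illustration."""
    return ambient_c + r_thermal * power_w * (1.0 - math.exp(-t_s / tau_s))

# After one minute (three time constants here) the wire is close to its
# full equilibrium offset above ambient, whatever the ambient is.
for ambient in (10, 20, 70):
    print(round(wire_temp_at(60, ambient, 10), 1))
```

Note that the temperature at 1 minute differs between a cool room and a warm room by exactly the difference in room temperature, which matches the equilibrium argument above.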

More on thermal conductance/resistance:


9 years ago

Watts = volts x current; this will be constant for a given set of figures.

How hot the wire actually gets will depend on how much heat is conducted away. Hotter air means less heat is conducted away. The difference will be minimal for a thin wire.
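The power figure above can be worked out from Ohm's law; as a sketch with invented supply and wire values (a constant-voltage supply means the current is set by the wire's resistance):

```python
# Electrical power into the wire from a constant-voltage supply.
# The voltage and resistance values are invented for illustration.
volts = 12.0         # supply voltage (constant)
ohms = 14.4          # resistance of the nichrome length (assumed)
amps = volts / ohms  # Ohm's law: I = V / R
watts = volts * amps # P = V * I, equivalently V^2 / R
print(round(watts, 1))  # -> 10.0
```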


Answer 9 years ago

A lot of the heat will be radiated away, so thickness is a significant factor (a thinner wire has greater surface area per unit volume).

Plus, another large proportion of the heat will be lost through the cooling effect of the air: the more the air moves (by breezes or convection), the greater the heat loss.