
Efficiency (not efficacy) of CPUs/computers: is all the power input technically converted into heat? (Answered)

It has recently been fairly hot where I live, especially in my room, where computer hardware seems to be increasing the temperature by a good 5-10 °F over other rooms.

This got me thinking, and I was wondering if any computer engineers/physicists may know the answer to this one. If I am running a power-hungry Intel CPU, coupled with RAM and maybe SSD storage (I wish!), is all the power fed into my system being converted into heat? In other words, if my rig consumes a good 250 W-400 W, and there are no transducer devices drawing power (LEDs or lights, speakers/microphones, monitors, motors, heaters, phone chargers, etc.), and all the energy goes into collecting data and calculating, then is *ALL* of it converted into 250 W-400 W of thermal energy (heat)? Or do the data operations themselves require energy, with entropy perhaps playing a role, so that 400 W of input would yield an apparent 399.981 W of heat output plus data?

8 Replies

kelseymh (author), Best Answer, 2014-08-01

Yes. At a measurable level (are you really using six-significant-figure equipment to measure power?), all of the input energy becomes waste heat, and does so rather quickly.

Steve mentioned energy conservation and computation; that's a cool thing to read about in science magazines, but is irrelevant (femtojoules) to your question.
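For scale, here is a quick back-of-envelope of that bound (both numbers are assumptions for illustration: room temperature, and a deliberately generous 10^15 bit-erasures per second):

```python
# Back-of-envelope: thermodynamic minimum energy to erase bits
# (the Landauer bound), compared with a PC's total power draw.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K (assumed)

e_bit = k_B * T * math.log(2)    # minimum energy per erased bit
erase_rate = 1e15                # assumed: 1e15 bit-erasures/s (generous)
p_landauer = e_bit * erase_rate  # power at the thermodynamic minimum

print(f"Landauer energy per bit:  {e_bit:.2e} J")        # ~2.87e-21 J
print(f"Power at 1e15 erasures/s: {p_landauer:.2e} W")   # ~2.9e-6 W
print(f"Fraction of a 400 W rig:  {p_landauer/400:.1e}") # ~7e-9
```

Even at that generous erasure rate, the theoretical minimum is a few microwatts: parts per billion of the 400 W going in.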

-max- (author), replying to kelseymh, 2014-08-01

Lol no, it is just an example. I was implying that there are microwatts of power "disappearing", or being converted into data manipulation.

The example does not have to be a computer; it can be any computational device, including mechanical systems built from gears, cogs, etc.

kelseymh (author), replying to -max-, 2014-08-01

You should look up the relevant articles in scholar.google.com (that will get you actual peer-reviewed science, instead of wacko Internet stuff). Yes, there is a relationship between energy and computation; specifically, the relationship is in _erasing_ data, not in computation per se. However, quantitatively the amount of energy involved is immeasurably small for any practical system (like your PC).
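Turning your own hypothetical numbers around shows why: at the Landauer minimum, the 19 mW gap between 400 W in and 399.981 W of heat out would require an absurd erasure rate (a sketch, taking your figures at face value):

```python
# How many bit-erasures per second would the hypothetical "missing"
# power (400 W in, 399.981 W of heat out) correspond to, if every
# erasure cost exactly the Landauer minimum?
import math

k_B = 1.380649e-23             # Boltzmann constant, J/K
T = 300.0                      # room temperature, K (assumed)
e_bit = k_B * T * math.log(2)  # ~2.87e-21 J per erased bit

p_gap = 400.0 - 399.981        # hypothetical 19 mW "stored in data"
bits_per_s = p_gap / e_bit

print(f"Required erasure rate: {bits_per_s:.1e} bits/s")  # ~6.6e18 bits/s
# Nothing on your desk erases 6.6e18 bits every second, so even
# a 19 mW discrepancy wildly overstates the effect.
```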

steveastrouk (author), replying to kelseymh, 2014-08-01

Presumably quantum computation is the limiting case?

iceng (author), 2014-08-01

Don't include the display device because it is a power hog!

And then you are sending all those watts out your network to keep the WWW from freezing!

Not to mention the hand warmer pointer thingy.

BTW quantum computation does not all happen in the same dimension as your overheated room...

thematthatter (author), 2014-08-01

A way to find out (although probably not 3-sig-fig accurate) is to remove all the fans, submerge your computer in oil, and measure the temperature before and after.

That way you won't have to factor in the 2 to 4 watts that the fans would draw.

But my guess is that it's not putting out 400 watts of heat; that would be more than six 60 W light bulbs in a room.
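Rough arithmetic for the oil bath (all numbers assumed for illustration: ~20 kg of mineral oil, a typical specific heat of ~1.67 kJ/(kg·K), a well-insulated tub; real heat loss to the room would make this an underestimate):

```python
# Crude oil-bath calorimetry: estimate dissipated power from the
# temperature rise of the bath, P ~= m * c * dT / dt.

m_oil = 20.0    # kg of mineral oil (assumed)
c_oil = 1670.0  # J/(kg*K), typical for mineral oil (assumed)
dT = 7.2        # measured temperature rise, K (illustrative)
dt = 600.0      # elapsed time, s (10 minutes)

power = m_oil * c_oil * dT / dt
print(f"Estimated dissipation: {power:.0f} W")  # ~401 W for these numbers
```

With those assumed numbers, a 400 W rig would warm the bath by roughly 0.7 K per minute, which is easy to measure with an ordinary thermometer.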

-max- (author), replying to thematthatter, 2014-08-01

So you are thinking of a large calorimetry experiment? That sounds feasible! Although that would also factor in impedances, galvanic (resistive) losses, etc. It would be nice if I could separate those out too!

steveastrouk (author), 2014-08-01

It is theorised that there is a consumption of energy in the process of computation, but I don't think it's been measured yet. It makes sense if you understand Claude Shannon's communication theorems.
