Are high-power resistors really necessary on a benchtop PSU conversion?
I know that ATX supplies need to draw some minimum current just to stay on, but while designing a project for my school's Engineering Technology department I found that the heat generated by such a small resistance (around 10 Ω) was unacceptably high. Originally I was looking at Instructables and this site for inspiration, but all the cooling measures taken to keep the high-power resistor from becoming a hazard seemed rather silly.
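To put rough numbers on that (assuming the standard 5 V and 12 V ATX rails), a 10 Ω dummy load dissipates

$$P = \frac{V^2}{R} = \frac{(5\ \text{V})^2}{10\ \Omega} = 2.5\ \text{W} \quad\text{or}\quad \frac{(12\ \text{V})^2}{10\ \Omega} \approx 14.4\ \text{W},$$

which is exactly why those builds end up bolting heatsinks and fans onto a single resistor.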
A few calculations and experiments later with the 250 W power supply, I determined that 160 Ω, 1 W resistors and 1 kΩ, 1/2 W resistors were perfectly acceptable for keeping the PSU awake and functioning. I connected one of each between each voltage rail and ground. According to my calculations, I can get away with dissipating a grand total of two watts or less, spread across multiple resistors.
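For anyone checking my math (same rail assumptions as above; the worst case is the 12 V rail):

$$P_{160\,\Omega} = \frac{(12\ \text{V})^2}{160\ \Omega} = 0.9\ \text{W}, \qquad P_{1\,\text{k}\Omega} = \frac{(12\ \text{V})^2}{1000\ \Omega} \approx 0.14\ \text{W}.$$

That keeps each resistor within its rating (the 160 Ω part runs close to its 1 W limit, but within it), and the 5 V and 3.3 V rails dissipate far less.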
The current divider rule dictates that if you add resistances in parallel, the resulting resistance will be smaller, meaning more current will flow through the overall circuit. However, this increased current divides itself across the parallel branches: for a branch $R_x$ in parallel with $R_t$, the current through $R_x$ is

$$I_x = \frac{R_t}{R_x + R_t} I_T,$$

where $I_T$ is the total current entering the pair.
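As a quick illustration with made-up numbers (not measurements from my build): if a total of $I_T = 1\ \text{A}$ splits between the bleeder $R_x = 160\ \Omega$ and a project load $R_t = 40\ \Omega$, the bleeder only sees

$$I_x = \frac{40}{160 + 40} \times 1\ \text{A} = 0.2\ \text{A}.$$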
More importantly, the PSU regulates its output voltage, so the resistor you've soldered in always sees the same rail voltage. Its current and power dissipation will not change enough to be significant no matter how large or small a resistance you attach in parallel with it. The one exception is an effective short, and what in God's name are you doing intentionally shorting the terminals of your bench-top PSU?
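Concretely, on the 5 V rail the bleeder always sees

$$I = \frac{5\ \text{V}}{160\ \Omega} \approx 31\ \text{mA}, \qquad P = \frac{(5\ \text{V})^2}{160\ \Omega} \approx 0.16\ \text{W},$$

whether the rail is otherwise unloaded or feeding an amp to your projects. Extra load changes the total current drawn from the supply, not the resistor's share.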
Now, several months later, the PSU is still operating happily and powering multiple microcontroller projects on a display board. I can therefore conclude that the high-power 10 Ω resistors in many computer power supply conversions are probably a gratuitous waste of wattage. You can get away with a much higher resistance, which passes far less current and dissipates far less power.