I know how to calculate resistive losses, but do different types of resistors have different amounts of loss (power in vs. power out, in watts)?

From my own observations, connecting a 10 Ω resistor across a battery causes significant heating in both the resistor and the battery, and makes the wires warm. All of this is resistive loss. But when I exchange the resistor for a 10 kΩ one, there is virtually no heating at all. What if I use a 0 Ω resistor (a direct short)? Then the only thing getting hot would be the power supply, due to its internal resistance.

Does this mean higher resistance is less lossy and, by definition, more efficient? Or is it simply that there is less current flow, and therefore less power loss, with the efficiency (% of power lost) unchanged?

With an ideal constant current source, will the losses through any resistive load be equal? (X watts lost/dissipated @ 1 A.)

Is it possible to limit current like a resistor does, but without the losses? (I know PWM techniques are more efficient, but I want actual resistance rather than chopping the current flow and smoothing it with an inductor/capacitor LC filter.)
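To make the observation concrete, here is a rough calculation I ran with made-up numbers (a hypothetical 9 V battery with 1 Ω internal resistance; both values are assumptions, not measurements), showing how the heat splits between the load resistor and the battery for the three cases above:

```python
# Hypothetical numbers to illustrate the observation: a 9 V battery
# with 1 ohm internal resistance, loaded with different resistors.
V = 9.0        # battery EMF in volts (assumed)
r_int = 1.0    # internal resistance in ohms (assumed)

for R_load in (0.0, 10.0, 10_000.0):
    I = V / (r_int + R_load)      # total current (A), Ohm's law on the whole loop
    p_load = I**2 * R_load        # heat dissipated in the resistor (W)
    p_int = I**2 * r_int          # heat dissipated inside the battery (W)
    print(f"R = {R_load:>7.0f} ohm: I = {I:.4f} A, "
          f"P_load = {p_load:.3f} W, P_internal = {p_int:.3f} W")
```

With these numbers the 10 Ω load dissipates several watts, the 10 kΩ load only milliwatts, and the dead short dumps everything into the battery's internal resistance, matching what I see in practice.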