Why do transformers have different power ratings?

I've noticed that some wall warts have a different current rating than others with the same voltage. Is there an internal resistor in them? If there is, why? Why couldn't they have just put it into the object they want to power? Or is it the actual transformer causing the power loss? Thanks

Xellers, 8 years ago
If you follow Ohm's law, you know that the current that flows through a resistor (the load that you are powering) is a function of its own resistance and of the voltage across it.

I = V/R

where:
V= Voltage in volts
R= Resistance in ohms
I= Current in amperes

Therefore, the current that your load draws is set by the load itself, not by the transformer. If your voltage is 12 volts and the resistance of the object is 12 ohms, then it draws 1 amp. So whether you use a 2 amp transformer or a 4 amp transformer, it makes no difference: the current flowing through the object is still 1 amp. However, if the resistance of the load were to drop to 4 ohms, then the current it would demand would be 3 amps. The 2 amp transformer would try to provide 3 amps, but it would overheat and destroy itself. The 4 amp transformer, however, would provide 3 amps without trouble.
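
If it helps, here is that same arithmetic as a quick Python sketch. The function names and the 12 V / 12 ohm / 4 ohm numbers are just the ones from the example above, nothing standard:

```python
def load_current(voltage_v, resistance_ohms):
    """Ohm's law: I = V / R. The load alone sets the current draw."""
    return voltage_v / resistance_ohms

def check_transformer(rating_a, voltage_v, resistance_ohms):
    """Compare what the load draws against the transformer's rating."""
    draw = load_current(voltage_v, resistance_ohms)
    verdict = "is fine" if draw <= rating_a else "will overheat"
    print(f"{resistance_ohms} ohm load draws {draw:.1f} A -> "
          f"{rating_a} A transformer {verdict}")

check_transformer(2, 12, 12)  # 1 A draw: 2 A transformer is fine
check_transformer(2, 12, 4)   # 3 A draw: 2 A transformer will overheat
check_transformer(4, 12, 4)   # 3 A draw: 4 A transformer is fine
```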

The actual construction of the transformer determines how much current and voltage it can provide. The turns ratio between the primary and secondary coils determines the output voltage, and the actual number of turns (along with the core size and wire gauge) determines how much current it can supply. I know that this sounds confusing, but let's say that you have a 120 volt outlet and two transformers, each with a 10:1 turns ratio. If you plug either one of them in, you get 12 volts on the output. However, let's say that one of them has 10 turns on its primary and 1 turn on its secondary, while the other one has 100 turns on its primary and 10 turns on its secondary. Both give 12 volts, but the latter transformer will be able to provide more current. If a transformer can provide more current, then chances are that it will also need a larger core, and it will need to be made from thicker wire so that it doesn't heat up.
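
A small sketch of the voltage side of this, using the ideal-transformer relation Vs = Vp * (Ns/Np) and the made-up turn counts from the example:

```python
def secondary_voltage(v_primary, n_primary, n_secondary):
    """Ideal transformer: output voltage scales with the turns ratio."""
    return v_primary * n_secondary / n_primary

print(secondary_voltage(120, 10, 1))    # 12.0 V, small transformer
print(secondary_voltage(120, 100, 10))  # 12.0 V, bigger transformer
# Same 10:1 ratio, same output voltage. The current capability differs
# because of core size and wire gauge, which this ideal formula ignores.
```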

The company that makes the load and the transformer knows all of this. Therefore, once they design the load, they calculate how much current it will draw. Then they make a transformer that can provide just slightly more than that current. If the load draws 1 amp at 12 volts, they pair it with a transformer that can provide 1.3 amps at 12 volts. This saves the company money, because a 1.3 amp transformer is cheaper to make than a 2 amp transformer, and so on. The headroom matters, too: a transformer rated at exactly 1 amp, powering a load that consistently draws 1 amp, would be running at its limit all the time and would quickly overheat.
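
As a sketch of that sizing rule (the 30% margin here is just the figure implied by the 1 A / 1.3 A example, not an industry standard):

```python
def pick_transformer_rating(load_current_a, headroom=0.3):
    """Rate the transformer above the load's steady draw so it
    never has to run continuously at its own limit."""
    return load_current_a * (1 + headroom)

print(pick_transformer_rating(1.0))  # 1.3 A rating for a 1 A load
```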
NachoMahma, 8 years ago
The ones with the higher ratings are built with components designed for higher current. Wiring will be of a heavier gauge, and parts such as regulators will be "beefier."