I'm planning to build a brushless DC motor with an inrunner permanent-magnet rotor and the stator on the outside. How can I determine the proper copper wire gauge for the stator windings? How much current will the motor pull?

At first I thought it was just "Power = Voltage * Current", so knowing the power and the voltage I should be able to figure the current. However, when the motor is spinning it generates back EMF, which counters the applied voltage and reduces the current. At no load, an ideal motor should draw no current, and a real one draws only a small current to overcome friction and losses; in any case it would be much lower than the nominal current. So, for instance, a 1000 W motor at 50 V should draw 20 A nominal, but in most situations it would draw less than that?

How much is the starting current, when you have to overcome inertia and tire friction? I know the motor can take higher currents for short periods before it gets too hot, but how can I determine what the "average" current will be? Is there a rule of thumb that motor designers use?

Also, the amp rating for each AWG seems to vary a lot. Some places list a conservative rating used for home wiring, but other places list much higher currents. If the stator is on the outer side and exposed to air, how much current can the copper handle for each AWG?

Any insight will be appreciated, thanks.
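To make my reasoning concrete, here is a rough Python sketch of the idealized DC-motor model I described (current = (V - back-EMF) / R, so stall current is V/R and no-load current is near zero), plus the standard AWG diameter formula combined with a current-density rule of thumb for windings. The resistance, Kv, and current-density numbers below are illustrative assumptions, not values for any real motor:

```python
import math

def motor_current(v_supply, kv_rpm_per_volt, rpm, resistance_ohm):
    """Steady-state current of an idealized DC motor:
    I = (V - back_emf) / R, with back_emf = rpm / Kv."""
    back_emf = rpm / kv_rpm_per_volt
    return max(0.0, (v_supply - back_emf) / resistance_ohm)

def awg_diameter_mm(awg):
    """Bare-copper diameter from the standard AWG formula."""
    return 0.127 * 92 ** ((36 - awg) / 39)

def awg_ampacity(awg, current_density_a_per_mm2):
    """Rough winding ampacity = cross-section area * current density.
    Common rules of thumb (assumptions, verify thermally): roughly
    4-6 A/mm^2 for enclosed windings, higher with strong airflow."""
    d = awg_diameter_mm(awg)
    area = math.pi * d ** 2 / 4
    return area * current_density_a_per_mm2

# Illustrative numbers (assumed): 50 V supply, Kv = 100 rpm/V, R = 0.1 ohm.
stall = motor_current(50, 100, 0, 0.1)       # rpm = 0 -> V/R = 500 A
no_load = motor_current(50, 100, 4990, 0.1)  # near 5000 rpm -> ~1 A
print(stall, no_load)
print(awg_ampacity(14, 5))  # AWG 14 at a conservative 5 A/mm^2
```

This shows why the ratings vary so much: an AWG 14 winding at a conservative 5 A/mm² comes out near 10 A, below the 15 A home-wiring rating, while a higher assumed current density gives proportionally more. The real limit is the copper temperature rise, which depends on cooling, so any density figure needs a thermal check on the actual motor.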