Voltage regulator Design Problem?
I am building a voltage regulator for an automobile DC generator.
(I'm replacing the old DC generator regulator on pre-'60s cars.)
A battery is charged from a winding on the generator (20 V @ 60 A).
A voltage sensor monitoring the battery voltage turns a switch on and off, driving the field winding.
The field winding basically "turns on and off" the generator output.
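To make the control scheme concrete, here's a toy Python sketch of that on/off (bang-bang) regulation. The thresholds, charge/discharge rates, and time steps are all made-up illustrative numbers, not measurements from my system:

```python
# Bang-bang (hysteresis) regulation of the field winding, as described above.
# All numbers here are illustrative assumptions, not real vehicle values.

V_ON = 13.8   # field switch turns ON below this battery voltage (assumed)
V_OFF = 14.4  # field switch turns OFF above this battery voltage (assumed)

def regulate(v_batt, field_on):
    """Hysteresis comparator: decide whether the field winding is energized."""
    if v_batt < V_ON:
        return True          # battery low: energize field, generator charges
    if v_batt > V_OFF:
        return False         # battery high: de-energize field
    return field_on          # inside the hysteresis band, keep previous state

def simulate(steps=1000, v0=12.0):
    """Crude first-order battery model: charging raises V, load sags it."""
    v, on = v0, False
    history = []
    for _ in range(steps):
        on = regulate(v, on)
        v += 0.01 if on else -0.005   # toy charge/discharge rates per step
        history.append(v)
    return history

trace = simulate()
```

Running this, the battery voltage ramps up from 12 V, then bounces between roughly the two thresholds, which is all the real regulator is doing, just with relay contacts or a transistor instead of an `if` statement.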
The battery is connected directly to the generator (why, and why not, is coming up).
But when the generator's output is lower than the battery voltage, current will flow backwards from the battery through the generator winding and discharge the battery. So I need some isolation between the generator output and the battery.
Alternators are easy, since they have a bridge rectifier on the output that keeps current from flowing backwards.
Old regulators used a relay, which burned out sooner or later.
So, easy, use a diode, right? Just put a diode in series between the generator and the battery.
By the way, the open-circuit voltage on a generator is about 200 VDC.
The best diode I can find has a 0.7 V drop at 60 A, which is 42 watts of heat to dissipate.
Ahhh, so use a FET. They can get down into the milliohms, right?
Except the reverse-polarity (and ESD) protection diode that is always present across a FET (its body diode) defeats the purpose of using it as a reverse-current switch.
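Just to put numbers on the diode-vs-FET trade-off: the diode figures are the ones above, and the FET on-resistance is an assumed example value for a low-Rds(on) part, not a specific device:

```python
# Back-of-envelope dissipation comparison for the series blocking element.
# Diode numbers are from the problem statement; Rds(on) is an assumed example.

I_CHARGE = 60.0      # amps, full charging current

# Series diode: P = Vf * I
V_F = 0.7            # volts, diode forward drop at 60 A
p_diode = V_F * I_CHARGE          # ~42 W of heat

# FET used as a switch: P = I^2 * Rds(on)
RDS_ON = 0.002       # ohms; assumed 2 mOhm part for illustration
p_fet = I_CHARGE**2 * RDS_ON      # ~7.2 W of heat

print(f"Diode: {p_diode:.1f} W   FET: {p_fet:.1f} W")
```

Which is exactly why the FET is so tempting here, if only the body diode weren't in the way.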
So who out there is smarter than me?
What goes in the box marked ??
Any solutions accepted.
Ungefahrt (now neither young nor fast)