
Drop 0.1 - 0.2 volts DC?

I am building a charger for my Apple device, which requires exactly 5 V (according to the internet). I am using a voltage booster to step up from 3.7 V to 5 V, but the booster outputs about 5.1 - 5.2 V. I need a way to drop 0.1 V or 0.2 V using a resistor or something. Does anyone know which one to use?



I have the same question. I bought an adapter and its output voltage is 5.17 V, but my circuit needs exactly 5 V. My circuit draws about 1 A, so it's not a good idea to divide the voltage with resistors, and the minimum input voltage for a 7805 regulator is 7 V.

Your device will internally regulate the input voltage and/or current to whatever it needs to charge its battery, so it is not likely to be too picky about +/-0.2V. As others have pointed out, the voltage drop across a series resistor will vary with the current through it (which will vary considerably in this application). However, a diode in series with your charger output will have a fairly constant voltage drop over the entire range of current your device will draw. A "Schottky diode" has a typical "forward voltage drop" of about 0.2V (whereas a standard silicon diode drop is about 0.6V). Pick a Schottky diode with a "maximum forward current" limit high enough to accommodate your charger circuit's maximum current output (with a little extra headroom) and a "peak reverse voltage" limit of 10V or greater (not too critical, just in case polarity gets reversed).
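The arithmetic here can be sketched quickly. The 5.2 V booster output and the 0.2 V Schottky drop are figures from this thread; the 1.5 A diode rating is a made-up example part, not a recommendation:

```python
# Rough sanity check for dropping a boost converter's output with a
# series Schottky diode. Booster output and diode drop are the values
# discussed in this thread; the diode rating is a hypothetical example.

booster_out_v = 5.2        # measured booster output (volts)
schottky_drop_v = 0.2      # typical Schottky forward voltage drop (volts)
charger_max_a = 1.0        # expected maximum charge current (amps)

device_in_v = booster_out_v - schottky_drop_v
print(f"Voltage at device: {device_in_v:.1f} V")   # Voltage at device: 5.0 V

# Pick a diode whose maximum forward current exceeds the charger's
# maximum output, with some headroom.
diode_max_forward_a = 1.5  # example diode rating (amps)
assert diode_max_forward_a > charger_max_a * 1.2, "not enough headroom"
```

Note that the Schottky's drop does rise somewhat with current, so check the diode's datasheet curve at your actual charge current.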

At the risk of stating something obvious...

If you place a 1 ohm resistor between your device and its approximately-5-volt charger, and then place the probes of a voltmeter across that 1 ohm resistor to measure the voltage across it, this could serve as a way to measure the charging current: I = Vresistor/R = (Vresistor)/(1 ohm), i.e. 0.1 volts across the resistor for every 0.1 A = 100 mA of charging current.
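That Ohm's-law conversion can be written out as a one-liner. The 1-ohm shunt value is from the paragraph above; the sample readings are invented:

```python
# Convert a voltage reading across a 1-ohm series (shunt) resistor
# into charging current using Ohm's law, I = V / R.

SHUNT_OHMS = 1.0

def charging_current_ma(v_across_shunt: float) -> float:
    """Return charging current in milliamps for a given shunt voltage."""
    return v_across_shunt / SHUNT_OHMS * 1000.0

# 0.1 V across the 1-ohm resistor means 100 mA of charging current.
print(f"{charging_current_ma(0.1):.0f} mA")   # 100 mA
print(f"{charging_current_ma(0.02):.0f} mA")  # 20 mA
```

A 1-ohm shunt is convenient for mental math, but remember it also drops a tenth of a volt per 100 mA, which eats into the charger's output voltage.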

Also, dropping the voltage a little is maybe something you wanted to do anyway.

I can't guarantee the charging current will be constant. It will likely differ depending on how much current your i-Whatever decides it wants to draw, e.g. around 200 mA when it's really charging hard, and only 10 mA when it is "full" and goes into sleepy mode. Both those numbers are of course wild guesstimates. If you want exactness, you could take those measurements yourself.

I suspect your Apple(r) device, whatever it is, is designed to break, to obsolete itself, in about six months, no matter how nicely you treat it.

So I suggest you just go with 5.2 volts, and que sera sera, because you already own this awful device, made by that awful company, so trying to charge it and use it is really just making the best of a bad situation...


I think some of my prejudices against Apple(r) are starting to leak into this comment.


Seriously though, I'm just going to echo the sentiments of those who say the extra 0.2 volts probably won't hurt anything, and also echo Steve's advice of measuring the output voltage with an appropriately sized load attached, because that (better) measurement will probably be a few tenths of a volt lower than the open-circuit (zero current) voltage measurement.

Yeah, Apple products want really "clean" power. I think Steve's comment is a good idea, as is ATU's.



USB allows +/- 0.25 V (4.75 V - 5.25 V), so you could also use a diode with a 0.2 V drop.

Did you measure 5.1 V when the load was connected?

The voltage difference isn't going to do any harm.

You should use a voltage regulator IC; they are more stable/accurate.