Why do we have to use a current source when it does not exist? Naturally only a voltage source exists.


-max- 2 years ago

No, neither perfect voltage sources nor perfect current sources exist. Everything has what might be called an output impedance (or Z). Think of output Z as a phantom resistor in series with the output of some ideal voltage source. Sometimes this may even be called ESR, or Equivalent Series Resistance.

Something that has a really low output Z would act like a voltage source, such as big lead-acid batteries or LiPos, for example. Something that has a really high output Z (approaching infinity, in fact) would act more like a current source. In fact, if you had a supply voltage approaching infinity and a Z that grew in proportion to it, then you would indeed have an "ideal" theoretical current source, with the current set by the ratio of that voltage to that Z.
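
To make that ratio idea concrete, here's a minimal sketch (Python, with made-up numbers): the source is modeled as an ideal voltage behind a series output Z, and scaling both up together while keeping V/Z fixed at 10 mA makes the load current nearly independent of the load.

```python
# Thevenin-style source: ideal voltage v_src behind output impedance z_out.
# Illustrative numbers, chosen so v_src / z_out is always 10 mA.
def load_current(v_src, z_out, r_load):
    return v_src / (z_out + r_load)

for scale in (1, 10, 100, 1000):
    v, z = 10.0 * scale, 1000.0 * scale
    i_small = load_current(v, z, 10.0)    # 10 ohm load
    i_big = load_current(v, z, 1000.0)    # 1 kohm load
    print(f"V={v:.0f} V, Z={z:.0f} ohm: "
          f"I(10 ohm)={i_small*1e3:.2f} mA, I(1 kohm)={i_big*1e3:.2f} mA")
```

Both load currents converge on 10 mA as the scale grows; that insensitivity to the load is exactly the current-source behavior described above.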

However, most real-world supplies are neither, as they have some finite output voltage and output Z; infinite and zero resistance do not really exist. The idea of Z in this sense is more of a mathematical concept than anything, but it's useful to understand in power supply design, impedance matching or bridging, and many other things. In fact, the same core concepts apply to ALL forms of energy, be it thermal (temperature, heat, and thermal resistance), fluid dynamics (pressure, flow, and backpressure), etc.

You can just as easily configure a power supply to have a "constant current" output or a "constant voltage" output, or even both, where if the load attempts to draw more current than the power supply's current limit is set to, the voltage will sag so that the current is regulated to what has been set. In fact, this is what most lab power supplies do! Just set the maximum voltage and maximum current limits, and whether it sits in CV or CC mode depends on the input Z of the load!
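
Here's a rough sketch of that CV/CC crossover (Python, assuming a simple resistive load; the setpoint names are made up):

```python
def bench_supply(v_set, i_limit, r_load):
    """Idealized lab supply: CV until the load would draw more than i_limit,
    then the voltage sags so the current holds at i_limit (CC)."""
    if v_set / r_load <= i_limit:
        return v_set, v_set / r_load, "CV"   # load Z high enough: voltage regulated
    return i_limit * r_load, i_limit, "CC"   # load Z too low: current regulated

# Same 12 V / 1 A settings, two different loads:
print(bench_supply(12.0, 1.0, 100.0))  # (12.0, 0.12, 'CV')
print(bench_supply(12.0, 1.0, 2.0))    # (2.0, 1.0, 'CC')
```

With identical settings, the high-Z load leaves the supply in CV mode and the low-Z load forces it into CC mode.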

-max- 2 years ago

Hopefully all that made sense. To answer your question more directly: the simple answer is that the premise just isn't true. We tend to use constant-voltage-ish power supplies because it's kind of an arbitrary convention, but constant current sources are also used a LOT in many applications. Just look inside any analog amplifier and you will see current sources, current sinks, and current mirrors, and the same goes inside digital chips as well!
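
For a feel of the current-mirror case, here's a back-of-the-envelope sketch (Python; the fixed 0.7 V Vbe and the neglect of base current and the Early effect are the usual textbook simplifications, and the part values are made up):

```python
# Classic two-transistor BJT current mirror: a resistor from the supply sets
# the reference current, and the output transistor copies it.
V_CC = 12.0     # supply voltage (illustrative)
V_BE = 0.7      # assumed diode drop of the reference transistor
R_REF = 5600.0  # reference resistor (illustrative)

i_ref = (V_CC - V_BE) / R_REF  # ~2 mA set by the resistor
i_out = i_ref                  # mirrored, ignoring base current / Early effect
print(f"I_ref = I_out = {i_out * 1e3:.2f} mA")
```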

LED drivers are almost always constant current, because powering LEDs from constant voltage is bad for a few reasons. For one thing, LEDs are super picky about what voltage is across them: even half a volt too high and you might blow the LED, and half a volt too low and it may not even come on! Another reason is that as the LED warms up, that already-tight voltage window shifts around; the forward voltage drop gets lower, so at a fixed supply voltage the current gets WAY higher, which heats it up more, and you can see where this goes. That is why the LED driver I pulled out of a former street lamp has a current output rating of 1500 mA and an output voltage rating of 120 V to 480 V.
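
To put numbers on how picky LEDs are about voltage, here's a sketch using the Shockley diode equation; the saturation current and ideality factor are illustrative guesses, not data for any real part:

```python
import math

V_T = 0.026    # thermal voltage at room temperature, volts
N = 2.0        # ideality factor, illustrative guess
I_S = 1.8e-27  # saturation current picked so I(3.0 V) is ~20 mA; illustrative

def led_current(v_f):
    """Shockley diode equation: LED current at forward voltage v_f."""
    return I_S * (math.exp(v_f / (N * V_T)) - 1)

i_nom = led_current(3.0)
print(f"3.0 V -> {i_nom * 1e3:.0f} mA")                        # ~20 mA
print(f"3.1 V -> {led_current(3.1) / i_nom:.1f}x the current")  # ~6.8x
print(f"3.5 V -> {led_current(3.5) / i_nom:.0f}x the current")  # ~15000x
```

An extra half volt multiplies the current by roughly four orders of magnitude in this model, and warm-up drift works the same way: a small drop in the required forward voltage at a fixed supply voltage means a huge jump in current.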

seandogue 2 years ago

A current source is used when one wants to control things using a current-focused paradigm. For instance, when handling current-dependent devices (diodes, certain sensors, etc.), it makes more sense to operate from the perspective of a current source rather than a voltage source.

It's consistent values that change the world from random to predictable. By using a current source, we contain some or most of that randomness, allowing us to predict the behavior of the device with higher accuracy, and thereby to produce useful, predictable items and take useful, accurate data.

Grok?

rickharris 2 years ago

Current (measured in amps) is the movement of electrons from atom to atom. (Here I assume the Niels Bohr model of the atom.)

This movement is what we call electrical flow.

Voltage is the potential that appears across a resistance as a result of that flow, in much the same way that water pressure can be measured across a restriction to the flow.
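
In the water analogy's terms, Ohm's law is the pressure-equals-flow-times-restriction relationship; a tiny worked example in Python, with arbitrary numbers:

```python
# Ohm's law: the potential (pressure) across a resistance (restriction)
# is proportional to the current (flow) through it.
current = 2.0     # amps (illustrative)
resistance = 6.0  # ohms (illustrative)
voltage = current * resistance
print(f"{current} A through {resistance} ohm drops {voltage} V")  # 12.0 V
```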

Electrons are negatively charged, and so electron current flows from a negative charge towards a positive charge.

https://www.instructables.com/id/How-electricity-an...

All of these "image systems" only allow us to better visualize what is happening, and are based on observation. They may well bear little connection to reality (IF we could observe that).

If you subscribe to Quantum theory you will have to make up your own images to explain it!


Current sources exist. They're real. I seen 'em.

+1

As I recall they are blue green...

You get unique, usable electrical properties from a constant current source. And any sufficiently high voltage source with a sufficiently high output resistance is, for all intents and purposes, a constant current source.

There is no perfect voltage source, there is no perfect current source.

As an example, you can't reliably drive LEDs from a constant-voltage supply; they need constant-current drive, or at least some form of current limiting.