The photos really make it look like an action figure... Great costume and great photos!
I have to admit, English is not my native language, and when I first saw the first picture I said "no, that's probably not the cardboard I'm thinking of", because... well, it doesn't look like cardboard anymore. It looks much more solid and professional. Very good job! And thanks for sharing.
Without too many words, "I know one thing: that I know nothing" [Socrates]
The problem is that you can't just "go for 10–15 mA", because you can't set a current without (a) a current source or (b) a resistor to set that current. The 1.5V battery circuit you propose is perfectly legitimate, but it falls into my "cheap Chinese-style circuits" class: fine if you don't care about performance (your LED will not light fully, and your battery may not last as long as it could). And of course, in this kind of circuit you will always have to use a voltage lower than the LED's rating (1.5V for 2V LEDs, 2.5V for 3V LEDs), otherwise you cannot guarantee it won't blow.

In the end, I'm not saying you can't power an LED without protection. I just want to point out that, in my opinion, this is bad practice and should be avoided unless you are trying to use as few components as possible, and only as long as you are aware that you will get very poor performance (for instance, low light output, a shorter life, or higher consumption relative to the result).
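Not part of the original comment, but the resistor-based current setting described above is just Ohm's law across the series resistor. A minimal sketch (the 5V supply, 2V forward voltage, and 15mA target are illustrative values, not from the discussion):

```python
# Sketch: sizing a series resistor to set an LED current.
# All numeric values below are illustrative assumptions.

def series_resistor(v_supply, v_forward, i_target):
    """Resistance (ohms) that drops the excess voltage at the target current."""
    if v_supply <= v_forward:
        raise ValueError("supply must exceed the LED forward voltage")
    return (v_supply - v_forward) / i_target

# Example: 5 V supply, 2 V red LED, 15 mA target current.
r = series_resistor(5.0, 2.0, 0.015)
print(round(r))  # 200 (ohms)
```

In practice you would pick the nearest standard value (e.g. 220 ohms) and accept a slightly lower current.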
Well, since LEDs (and diodes in general) are highly non-linear components, a small change in voltage translates into a large change in current. For instance, this image shows the first graph I found searching on Google: going from 3V to 3.1V changes the LED current from 1A to approximately 1.5A, and at 3.2V it is over 2A. And keep in mind that this varies with temperature.

Now, if you add a 0.1-ohm resistor in series, you will draw approximately 0.85A at 3V and 1A at 3.1V, going to around 1.2A at 3.2V.

As you can see, without a resistor, small changes in voltage result in large changes in current. With a 3V ±5% source, you can get half the nominal current or double it. So, in my experience, there is no such thing as an "appropriate voltage". You can either choose to protect the LED, using at least a resistor to limit the current and avoid overstressing both the LED and the driver (or, better, using a current source instead of a voltage source), or decide you don't care about shortening the life of the LED and the driver, as they do in cheap Chinese-style products. Period.

PS: maybe you are saying this because in some circuits you built you drove LEDs directly from voltage outputs. Please note, however, that if you didn't get twice the current, the reason is that every driver has a so-called output impedance. That is why you didn't blow up your circuit or the LED, and I totally agree that it works. But this is exactly my second case: you are using the internal resistance to mitigate the effect of the non-linearity, making the driver dissipate the power internally. This will definitely shorten its life; I'd rather use a dedicated resistor (whose power rating I know) than rely on a parasitic resistance of unknown value and unknown power rating to keep the circuit working.
Sorry, I couldn't understand what you meant, since it seems to me to be the opposite of what you were saying before. What I wrote is that you can't use the LED as a fuse, reasoning "well, this will fry before the output does", because that is not guaranteed. Maybe the LED fries, maybe the output does. For my part, I will always use a resistor in series, even a very low-value one if the source voltage is low, but I would never connect an LED directly to a constant-voltage source.
Maybe it's the LED, maybe it's the output stage of the audio port... who knows? For sure, I'll never connect an LED to an output I don't know without protecting it (it = the output port) with a resistor or another current-limiting protection.
Points 1 and 2: OK.

Point 3: well, for me, wasting 95% or 85% of the input power as heat both mean wasting almost all the power. Mathematically speaking, 85 < 95, but I would not change all of my house lamps to LEDs just because of that slight difference. Moreover, there is more variation among different LEDs than between LEDs and other sources (the cheap LEDs on strips are unlikely to have an efficiency greater than 10%).

Now, in my opinion, saying that LEDs emit less heat because they waste a lower percentage of the input power is misleading. Of course they do waste less energy, but the main reason is that LEDs produce much more light relative to the power they use (not the input power, but the power they use).

Here is a deliberately silly example: I switched from a low-end PC with Windows 7 to a high-end PC with Windows 10. My applications now run much faster than before. Looking at the benchmarks, Win10 is slightly faster than Win7 (approximately 5% faster), but don't you agree that saying "my applications run much faster because I changed OS" is misleading? OK, it is mathematically faster, but the real reason is that the new CPU is a high-end quad-core with hyperthreading instead of a slow old dual-core.

That said, I totally agree with your point 0: this is getting really silly. So I'll stop commenting here, also because the Instructable was about a different subject. I'll keep saying that LEDs are so efficient because they physically emit more light for the same power, you will keep saying it is just because they waste less energy as heat, and we will both be happy.
No, you aren't. And as I have already told you many times, it's true that LEDs, for comparable light output, emit much less heat. What I told you is that this is simply because the LEDs require much less power. Taking your example, a 60W incandescent bulb generates about 57W of heat, while a 60W LED generates about 51W: very similar. The real advantage is that to produce the same light as a 60W incandescent bulb you only need a 6W LED, which generates 5.1W of heat.

So the reason people usually say that LEDs emit less heat is that, for comparable light, they require much less power, not that they "dissipate a lower proportion of the electrical energy".

I've tried to say this in every way I could, so if I couldn't make you understand that I'm saying "you are right, I'm just questioning that sentence because it is misleading", well, I'm just a master's-level electronic engineer with a PhD in electronics, not a linguist, so I can't find any other way to put it.
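The arithmetic in the comment above can be written out explicitly. The efficiency figures (incandescent ~5% efficient, LED ~15%) are the ones implied by the 57W and 51W numbers quoted; they are round illustrative values:

```python
# Heat output = input power times the fraction NOT converted to light.
# Efficiencies below are the round values implied by the comment's figures.

def heat_watts(power_in, efficiency):
    """Watts dissipated as heat for a given input power and efficiency."""
    return power_in * (1 - efficiency)

print(round(heat_watts(60, 0.05), 1))  # 60 W incandescent -> 57.0 W of heat
print(round(heat_watts(60, 0.15), 1))  # 60 W LED          -> 51.0 W of heat
print(round(heat_watts(6, 0.15), 1))   # 6 W LED (same light) -> 5.1 W of heat
```

At equal input power the heat difference is small (57W vs 51W); the big reduction (57W vs 5.1W) comes from needing only a tenth of the input power for the same light, which is exactly the comment's point.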
Well, I suppose this depends on what you call "efficiency". As for electrical efficiency, I admit I don't know the figure for fluorescent lights, but you have to admit that losing 85% of the power as heat (which holds for power LEDs; I doubt the cheap LEDs on Chinese strips reach that) is very similar to the 95% of incandescent lights. Then, as I said earlier, LEDs have a much higher luminous efficacy, so 1W of electrical power into an LED generates much (much) more light than 1W of electrical power into a CFL or incandescent light.

Again, I'm not saying that changing your lights to LED technology will not reduce your heat. I'm just pointing out that the reason is not that they "dissipate a lower proportion of the electrical energy", but that they better exploit the remaining part, so you can reduce the input power.
In fact... LEDs have an electrical efficiency of about 10%! This means that almost 90% of the electrical power you give them is dissipated as heat. This varies a lot with the LED type and color (higher-power LEDs are more efficient), but "dissipate a lower proportion of the electrical energy" is wrong ;) Then, since with that 10% they can emit a lot more light than other light sources, they produce a lot of light with low power ;)
© 2016 Autodesk, Inc.