Off-grid home lighting from a solar panel and 12 V battery?

I ran three parallel DC circuits, all hooked to one feed going to the battery. Each circuit had a 5 W and a 3 W high-powered LED and a 2 W resistor.

I spaced the LEDs about 4 feet apart in each circuit, and the lighting was spectacular: it lit up 300+ sq. ft. with six LEDs. Then one of the 5 W

LEDs flickered, turned blue, and dimmed permanently. I don't know what I did wrong; I thought LEDs would only draw the power they need.

The guy at SuperBright LED wanted to sell me a $20 DC driver for each light, which would make the cost 50 times higher.

He said my math was OK and it should work. Each light had its own heat sink. Any help gratefully appreciated.
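For anyone checking the numbers: a minimal sketch of the series-string current, assuming the "2 W resistor" is roughly a 2-ohm part and hypothetical forward voltages of about 6.5 V for the 5 W LED and 3.4 V for the 3 W LED (none of these values come from the original post). It shows why a bare resistor on a battery is risky: a "12 V" lead-acid battery swings from ~11.5 V discharged to ~14.4 V while charging, and a small resistor lets the current climb steeply with that swing.

```python
def string_current(v_supply, vf_total=6.5 + 3.4, r_ohms=2.0):
    """Approximate current through one series string: I = (Vs - Vf) / R.
    Ignores Vf's own rise with current, so real currents run a bit lower.
    vf_total and r_ohms are assumed values, not measurements."""
    return max(v_supply - vf_total, 0.0) / r_ohms

# Battery voltage over a charge cycle:
for v in (11.5, 12.0, 12.7, 14.4):
    print(f"{v:4.1f} V -> {string_current(v):.2f} A")
```

Under these assumptions the current nearly triples between a discharged battery and one on charge, which is more than enough to cook an LED that was "fine" during the first test.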

lemonie (4 years ago)
In my experience of questions about high-powered LEDs, current drivers are always recommended.
I know people with combined wind and solar systems that store in big lead-acid batteries. They run inverters and use 240 V AC.

iceng (4 years ago)
Let's see your math and the details of the LEDs, please.
No, LEDs will take what you give 'em and then some, unless you use the RIGHT resistor or supply.

DC drivers make a lot of sense for off-grid lights because they are ~90-95% efficient. Using a resistor will blow as much as 25% of the available power as heat.
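The resistor loss is easy to estimate: whatever fraction of the supply voltage is dropped across the resistor is burned as heat. A minimal sketch, assuming a hypothetical ~9.9 V total forward drop for the two LEDs in series (not a figure from this thread):

```python
def resistor_loss_fraction(v_supply, vf_total):
    """Fraction of battery power burned in the ballast resistor.
    The resistor drops (Vs - Vf), so its share of the power is that
    drop divided by the full supply voltage."""
    return (v_supply - vf_total) / v_supply

# vf_total = 9.9 V is an assumed value for illustration.
print(f"{resistor_loss_fraction(12.0, 9.9):.1%}")   # at nominal battery voltage
print(f"{resistor_loss_fraction(14.4, 9.9):.1%}")   # while the battery is charging
```

Under these assumptions the resistor wastes roughly 18% at 12 V and over 30% while charging, which is why a switching driver pays for itself on a small solar budget.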

Tell me what the diode forward voltage drop is, and I can tell you what resistor to use.
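To make that concrete, here is a minimal sketch of the resistor calculation. The 9.9 V string drop and 700 mA target current are hypothetical numbers for illustration; size the resistor at the worst-case (charging) battery voltage so the current never exceeds the target.

```python
def ballast_resistor(v_supply, vf_total, i_target):
    """Series resistor for a target LED current: R = (Vs - Vf) / I.
    Returns (resistance in ohms, power dissipated in watts)."""
    v_r = v_supply - vf_total          # voltage the resistor must drop
    return v_r / i_target, v_r * i_target

# Assumed values: 9.9 V total forward drop, 0.7 A target,
# sized at 14.4 V (a lead-acid battery on charge).
r, p = ballast_resistor(14.4, 9.9, 0.7)
print(f"{r:.2f} ohm, {p:.2f} W")
```

Note the dissipation: at these assumed numbers the resistor itself has to shed over 3 W, so a small 2 W part would run hot even before the LEDs complain.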

Also, high-power LEDs need GOOD heatsinks to get rid of their waste heat, which is considerable.

Despite the hype, LEDs are not yet the most efficient light source out there.