Question on the aging of LEDs

If I run an LED at some voltage and current within the specs, does the emitter show the same aging as running it at double the current (exceeding the maximum “safe” specification) with PWM at equal on/off times, i.e. at the same integrated current?

Background: I have some Nitecore Tubes equipped with LEDs that are driven out of spec in high mode. Low mode is implemented with a PWM signal, I guess at the same amplitude as the supposedly deadly high mode. I wonder whether those LEDs keep their life expectancy when run in low mode. I might be wrong about it being double the current; more likely, high mode consumes much more energy than that.
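For concreteness, here is a back-of-the-envelope comparison (the currents and forward voltages below are assumed round numbers, not taken from any Nitecore datasheet): at 50% duty the average current comes out identical, but since forward voltage rises with current, the pulsed case dissipates somewhat more average power, and the junction runs hotter during each on-pulse than the averages suggest.

```python
# Constant 0.35 A vs. 0.70 A pulsed at 50% duty.
# All numbers are illustrative assumptions, not datasheet values.
I_CONST = 0.35      # A, constant-current drive
I_PULSE = 0.70      # A, PWM peak current ("doubled current")
DUTY = 0.5          # on-time fraction

VF_LOW = 3.0        # V, assumed forward voltage at 0.35 A
VF_HIGH = 3.3       # V, assumed forward voltage at 0.70 A (Vf rises with current)

avg_i_const = I_CONST                    # 0.35 A
avg_i_pwm = I_PULSE * DUTY               # 0.35 A -- same integrated current

avg_p_const = VF_LOW * I_CONST           # ~1.05 W
avg_p_pwm = VF_HIGH * I_PULSE * DUTY     # ~1.16 W -- higher, despite equal average current

print(f"average current: {avg_i_const:.2f} A vs {avg_i_pwm:.2f} A")
print(f"average power:   {avg_p_const:.2f} W vs {avg_p_pwm:.2f} W")
```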

It’s a good question, but I’m not aware it’s ever been tested. Based on the testing Cree does to estimate the lifespan of LEDs, junction temperature seems to be the primary factor in aging, but current also plays a role. So PWM would eliminate the temperature part, but not the current. I expect using PWM would result in slightly reduced lifespans compared to constant current.
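To give a feel for how that plays out, here is a minimal sketch of an Arrhenius-style lumen-maintenance model of the kind used for LM-80/TM-21 extrapolations. Every constant in it (decay rate, activation energy, current exponent, the junction temperatures) is a made-up placeholder, not Cree data, so only the direction of the comparison means anything.

```python
import math

BOLTZMANN_EV = 8.617e-5   # Boltzmann constant in eV/K

def decay_rate(tj_c, current_a,
               alpha_ref=2e-5,   # hypothetical decay rate at reference conditions (1/h)
               tj_ref_c=85.0,    # reference junction temperature (degrees C)
               i_ref_a=0.35,     # reference drive current (A)
               ea_ev=0.4,        # assumed activation energy (eV)
               n_current=1.5):   # assumed current-acceleration exponent
    """Lumen-decay rate: Arrhenius acceleration with temperature, power law with current."""
    tj_k, tj_ref_k = tj_c + 273.15, tj_ref_c + 273.15
    temp_accel = math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / tj_ref_k - 1.0 / tj_k))
    current_accel = (current_a / i_ref_a) ** n_current
    return alpha_ref * temp_accel * current_accel

def hours_to_l70(tj_c, current_a):
    """Hours until output drops to 70% (L70), assuming exponential lumen decay."""
    return -math.log(0.70) / decay_rate(tj_c, current_a)

# Constant 0.35 A at an assumed Tj of 85 C, versus 0.70 A pulses at 50% duty
# that push Tj to an assumed 95 C while on; decay only accrues during on-time.
print(f"constant current:  L70 ~ {hours_to_l70(85, 0.35):,.0f} h")
print(f"PWM at 2x current: L70 ~ {hours_to_l70(95, 0.70) * 2:,.0f} h of wall-clock time")
```

With these placeholder numbers the PWM case comes out shorter-lived, because the acceleration terms only ever see the peak current and the hotter on-pulse junction, never the average; the size of the gap is purely an artifact of the made-up constants.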

Unless you’re running this light 24/7, I doubt the reduced lifespan matters much. The LED will last longer than you will.

Hope so.

A visible difference between those 5mm LEDs is the thermal mass. The original emitter’s cathode has much more material and can be soldered onto a larger area of copper.

However, those lights are fun :-).

Thanks,

Thomas

I recall reading a post claiming that very low voltages actually wear the emitter faster than higher voltages within tolerance, and that PWM generally avoided the problem. I wish I could find the post.

Yeah, I recall that thread, or a similar one, over on CPF a couple of years ago. They were claiming that moonlight modes cause more wear on LEDs than running at high levels.

But I call B.S. on that one, since I’ve run several lights (Zebralights, 4sevens) at moonlight levels for thousands of hours, primarily as low-level nighttime illumination. Also, LEDs are used in electronics that stay on for years or even decades without noticeable dimming.

Until I see a study that replicates the behavior of the original claim, I’m not believing it.

I agree, low voltage=more wear sounds very improbable.

You will only notice the dimming if you compare it to a new LED with the same specs, though, since it progresses extremely slowly.

I’m going to test it this weekend. I have two Thrunite Archers. One has been used for over 8,000 hours on moonlight as a nightlight, among other general usage. The other one is new. There could be some variance between the emitters, so it won’t be conclusive, but it would be interesting to compare total visual output.

What about a filter inductor on the output of the FET, before the LED? Would smoothing out the current prolong its life?
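It would only help if the driver is arranged as a proper buck stage (inductor plus a freewheeling path), and at the low PWM rates typical of flashlight dimming the inductor would have to be huge. A rough illustration, with all values assumed rather than measured from the Tube:

```python
def ripple_pp(v_in, v_led, f_sw, l_henry):
    """Peak-to-peak inductor current ripple for a buck-style stage in steady state."""
    duty = v_led / v_in                              # steady-state duty cycle
    return (v_in - v_led) * duty / (f_sw * l_henry)

# Assumed values: fresh Li-ion cell, guessed LED forward voltage, a 100 uH inductor.
V_IN, V_LED, L = 4.2, 3.1, 100e-6

# At a low dimming-style PWM rate the "ripple" exceeds the LED current entirely
# (the formula breaks down, conduction goes discontinuous) -- no real smoothing.
print(f"{ripple_pp(V_IN, V_LED, 1_000, L):.2f} A peak-to-peak at 1 kHz")

# At a high switching rate the same inductor holds the current fairly steady.
print(f"{ripple_pp(V_IN, V_LED, 100_000, L):.3f} A peak-to-peak at 100 kHz")
```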

https://glamox.com/gsx/led-lifetime-and-the-factors-that-affect-it