What happens at the end of LED life?

They all say 100,000 hours or something ridiculous. (That assumes the electronics live that long, which is probably a bigger assumption than the LED itself.)

What they don’t say is, “the brightness starts diminishing from hour 1. 100,000 hours assumes you can stand it to be X% of the original.”

But how do they die?

What kills them?

Does the LED just get dimmer?

Does the yellow phosphor (assume a white LED) get dimmer (which would change the color balance)?
Does it get cloudy or opaque, or change colors?

If you overdrive or overheat them, what effect does that have?

Is it “you only get so many lumen-hours out of this LED, your choice between bright/short and dim/long”?

How often will the LED itself just conk out suddenly, like an incandescent bulb?

Just wondering. Topic for discussion I had not seen addressed.
100,000 hours is only 11 years :slight_smile:

wle

They get dimmer. In the datasheets one can usually find an “L70” lifespan. This tells you how many hours, on average, it takes for the LED to dim down to 70% of its initial brightness. At that point it still works, though! Just a bit dimmer.
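For the curious, the usual lumen-maintenance math (as in IES TM-21 extrapolation) assumes roughly exponential decay, Φ(t) = B·exp(−αt). A quick sketch with made-up illustrative numbers, not from any real datasheet:

```python
import math

def hours_to_fraction(l70_hours, fraction):
    """Given a rated L70 lifetime, estimate the hours until output
    drops to `fraction` of initial, assuming pure exponential decay."""
    alpha = math.log(1 / 0.7) / l70_hours  # decay rate implied by L70
    return math.log(1 / fraction) / alpha

l70 = 100_000  # hours, the headline datasheet figure (assumed)
print(round(hours_to_fraction(l70, 0.9)))  # hours until 90% output
print(round(hours_to_fraction(l70, 0.5)))  # hours until 50% output
```

Under this model, a 100,000 h L70 emitter is already down to ~90% after roughly 30,000 hours — the dimming really does start long before the rated lifetime.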

Heat kills LEDs! The higher the average temperature, the lower the lifespan.
Also what most people don’t know is that higher currents are actually better for LEDs than really low currents (at the same temperature). Very low currents drastically reduce the lifespan of an LED. This is something that is not stated in most datasheets.
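On the heat point: thermally driven degradation is often modeled with an Arrhenius acceleration factor. A rough sketch; the activation energy here is an assumed typical value, not a measured figure for any particular LED:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K
EA_EV = 0.7                # assumed activation energy (illustrative)

def acceleration_factor(t_low_c, t_high_c):
    """Arrhenius acceleration: how much faster degradation runs
    at t_high_c than at t_low_c (junction temps in Celsius)."""
    t_low = t_low_c + 273.15
    t_high = t_high_c + 273.15
    return math.exp((EA_EV / K_BOLTZMANN_EV) * (1 / t_low - 1 / t_high))

# Running the junction 20C hotter (85C vs 65C):
print(round(acceleration_factor(65, 85), 1))
```

Under these assumptions a 20C hotter junction ages the LED nearly 4× faster, which is why "the higher the average temperature, the lower the lifespan."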
German video on the topic:

wow, really… i would not have thought that!

{low current is bad}

wle

Interesting. Could you explain? I couldn’t find a figure in that presentation that explained it. And I don’t understand German.

Only sort of on-topic and doesn’t answer any of your questions, but I’ve had several household LED “bulbs” quit on me, and it’s always the driver/electronics and not the emitters. So yeah, the 100k hr lifespan quoted for most LEDs is meaningless for the average consumer because the other parts in the circuit are often the weaker link.

It’s too hard for me to follow what the speaker is saying…
But I’m surprised too about low current being ‘bad’ for the LED.
But what is low?
Is it less than half of the maximum recommended current?

Yes, that is very surprising. Is it referenced somewhere else?

It must be far less than half the maximum current, because Cree’s LM80 testing doesn’t show any indication that low current hurts the LED lifespan. It primarily shows that high temperature is by far the main culprit when it comes to degrading the LED. Current (high or moderate) appears to play very little part.

I’d like to see a study based on very low current. I’ve never heard of this. And based on solar lawn lamps (that use a dim LED), I don’t see it anecdotally. Also, many electronics use dim LED to display status or time, and they run continuously for years or decades.

Though perhaps the poster is referring to modern high-output LEDs. I’m a moonlight mode fan, so I’d be concerned if the speculation is true.

Sounds concerning, the ~5lm mode I use 99% of the time will slowly dim to 4lm? :D

WojtekimbieR: to alleviate your concerns, the 4lm will likely not occur for 10 years or so.
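A quick sketch of that reassurance, pinning a purely exponential decay to an assumed L70 of 100,000 h (hypothetical — substitute your emitter's real rating):

```python
import math

L70_HOURS = 100_000  # assumed L70 rating, not from a real datasheet

alpha = math.log(1 / 0.7) / L70_HOURS   # decay rate implied by L70
hours_to_4lm = math.log(5 / 4) / alpha  # 4 lm is 80% of 5 lm
print(round(hours_to_4lm), round(hours_to_4lm / 8760, 1))
```

So even running it continuously, it's on the order of 7 years before 5lm becomes 4lm under these assumptions — with a few hours a day of actual use, far longer.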

Maybemaybemaybe it has to do with 1 of 2 things.

1, it might collect moisture in the phosphors, and high currents (thus high heat) helps “boil off” the moisture. Kinda like occasionally running the engine of a car that’s kept in long-term storage, vs just letting it sit, collect condensation, and rust away internally.

2, repeated operation at lower currents might “pull” the innards of the crystal a certain way over time, and high heat helps anneal the innards and reduce the stress. Back to the car analogy, having an engine that just runs at idle forever, and is never “opened up” to shake loose all the crap that might’ve collected.

Other’n that, I’m stumped.

Well, since I’ve never heard of The_Driver’s speculation about low current causing LED aging, I’m going to file that info in the “likely not true” bucket. At least, until I see some study that proves otherwise. If it were true, you’d think it would show up in some manufacturer’s testing results, such as Cree’s LM80 testing.

So far, the only thing I’ve read that significantly reduces LED lifespan is heat. Specifically, when the LED junction temperature gets above 100C. Of course, electronics can fail in lots of other ways too, and as others have pointed out, the driver is likely to fail before the LED noticeably dims.
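For reference, junction temperature is usually estimated as Tj = Ta + Rth(j-a) × P. The thermal resistance and power figures below are assumed, illustrative values, not from any specific emitter or host:

```python
def junction_temp_c(ambient_c, rth_j_a, power_w):
    """Estimate junction temperature (C) from ambient temperature,
    junction-to-ambient thermal resistance (C/W), and dissipated power (W)."""
    return ambient_c + rth_j_a * power_w

# e.g. 25C ambient, 8 C/W effective thermal path, 10 W dissipated:
print(junction_temp_c(25, 8, 10))  # 105C — already past the 100C mark
```

It shows why a hot-rodded light on turbo can cook its emitter even when the body only feels warm: the junction runs far hotter than the outside of the light.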

From 13:16 to 16:20 he talks about how low current is responsible for dimming and color changes over time, much more so than high current. He shows three graphs to illustrate this.

My knowledge of electronics is very limited, and his Austrian(?) accent doesn’t make it any easier to understand. But I think that is what he is saying.

The guy is an expert working at Osram on these things. You would think he knows his stuff when he gives a presentation on this topic. I just saw this video a few days ago; it was interesting, and it seemed plausible.

In the video he states that this is something manufacturers don’t tell you in their datasheets. That kind of omission is a known problem. For example, most manufacturers don’t state the luminance (cd/mm^2) of their LEDs even though it is crucial information (for throw). Osram is one of the few that do.

If you look into the datasheets of some power LEDs you will see that they state rather high minimum currents. Maybe part of the reason is the reduced lifespan.

@walkintothenight: this is not my speculation! You’re dismissing it rather quickly…

From Cree’s testing, high current doesn’t appear to be much of a factor in dimming. They have no info on low current. Their testing shows that heat is the main culprit with dimming over time.

I just don’t see any evidence of low current affecting LEDs. If this was the case, wouldn’t we notice it in electronics that have “always on” LED indicators?

I’m dismissing it (at least for now) for 3 reasons:

1. I don’t understand German, so I cannot evaluate what he is saying.

2. It’s a rather surprising and counter-intuitive claim, which requires good evidence to support it.

3. I have never seen such a claim anywhere else, nor can I find any reference to it using Google.

Nevertheless, it’s an interesting claim, so I would like to see some independent evidence to back it up.

No, because how low “low” is depends on the specific LED type. A Cree XM-L has a different range compared to an indicator light.
Larger LEDs are usually more expensive than smaller ones, so nobody would use an underdriven power LED as an indicator. It makes no sense.
Similarly, in flashlights nobody will make a light with an XM-L driven at 10mA in its highest mode. It just doesn’t make any sense.

I think this “problem” is not really a problem. Basically nobody will ever notice.

Was there a figure in the presentation that shows how much more low current degrades the LED? At this point we (the non-German speakers here) have no idea of even the order of magnitude of this effect.

I agree it’s probably not a problem, especially for us flashlight users; thousands of hours is a long time.

Well, all I can say about lifespan is that my 10-year-old Fenix P2D, which I have EDC’d every single day for 10 years, was rated at 140 lumens new and still produces 140 lumens today.

The low vs high current thing is interesting and on some level that I can’t quite put my finger on, makes sense. Just not sure why it does lol.

Maybe this is part of why PWM is used so extensively with LEDs? It allows higher currents with lower outputs.

One possibility that comes to mind is that a higher current would light the LED up faster and “more completely” than a lower current.

Ever hear of an “Italian tuneup”? :smiley:

Back in the day, at least in my nabe, when your car started coking up with carbon deposits, etc., from stop’n’go driving, an IT meant taking it on an open highway and running it wfo for as long as possible to burn off all those deposits.

Kind of instinctive, and it usually worked!

Mainly to prevent color-shift from high/low currents. Dunno if designers even knew about this phenomenon. I’ve played around with LEDs since dinosaurs walked the earth, and this is the first time I ever heard of it!

[quote=The_Driver]

But it makes plenty of sense for the lowest mode. My Zebralights typically use less than 5mA in moonlight, and I’ve run some of them in moonlight mode for many months (cumulative). Often I just use them as a nightlight, and sometimes don’t even bother to shut them off (since they’ll last a couple of weeks on a charged cell).
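That “couple of weeks on a charged cell” figure checks out with round numbers (both values below are assumed for illustration, not measured for any particular Zebralight):

```python
# Moonlight runtime back-of-envelope check.
capacity_mah = 800  # assumed cell capacity
moon_ma = 2         # assumed moonlight-mode current draw
hours = capacity_mah / moon_ma
print(hours, hours / 24)  # 400 hours, about 16-17 days
```

So a single-digit-milliamp moonlight mode on a small cell lands right in that two-to-three-week ballpark.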

Granted, I’ve probably only used about 10,000 hours in moonlight mode, at most. But I still see no sign of dimming (on the high modes). If low mode was really more damaging than high mode, I should probably start to notice it.

That’s only anecdotal, so I’m not saying it’s proof. But until I see proof that moonlight mode really is damaging, it’s difficult to accept it as anything more than the speculation of one person. And since I don’t understand German, I’m not even sure if that’s what he said.

I also have a Sunwayman D40A n/w that has an ultra-ultra moonlight mode “feature” (actually it’s a bug). If I shut it off from moonlight, it goes into a very, very low moonlight mode that stays on indefinitely (until I use it on high or change the battery). It stays this way for months sometimes, if I’m not using the light. Again, probably at least 10,000 hours in that mode. Since it uses just microamps, surely I’d see some damage by now.

Very low currents cause cancer.