LED Heat Generation

Hello everyone,

This has come up on (way) more than one occasion, and I’ve done some back-of-the-napkin calculations to explain it. Just recently I did a little more than back-of-the-napkin, but it was still based on a lot of assumptions. Anybody heard of Nanolight?

So, the question: how much heat do LEDs produce? In other words, how efficient are they at converting electricity into light? Well, here, finally, are some numbers from a name we can trust: Cree! Yes, they have a whole document on heat management, and one of the sections of that document deals with how much heat an LED generates to begin with.

So, without further delay, from CREE:
LEDs generate visible light when current passes across the junction of the semiconductor chip. However, LEDs are not 100% efficient; much of the power running through an LED is output as heat, which thus needs to be dissipated. Cree royal blue XLamp LEDs are over 50% efficient and white XLamp LEDs are over 40% efficient. That is, under normal operating conditions, approximately 50% to 60% of the input power is output as heat, while the rest of the input power is converted to light. To be conservative, assume LEDs convert 25% of the input power to light and output 75% of the input power as heat. This estimate varies depending on current density, brightness and component, but is a good estimate for thermal design. Equation 1 below shows how to calculate the thermal power.

P_t = 0.75 × V_f × I_f

Equation 1: Thermal power calculation

where:
P_t is the thermal power (W)
V_f is the forward voltage of the LED (V)
I_f is the source current to the LED (A)

The V_f and I_f can be measured directly or calculated from the PCT, so the thermal power can easily be calculated. This is the amount of power the system/heat sink must dissipate.

Document source: http://www.cree.com/~/media/Files/Cree/LED%20Components%20and%20Modules/XLamp/XLamp%20Application%20Notes/XLampThermalManagement.pdf
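For anyone who wants to plug in their own numbers, here is Cree’s Equation 1 as a minimal Python sketch. The example voltage and current are illustrative, not from any datasheet:

```python
def thermal_power_w(v_f, i_f, heat_fraction=0.75):
    """Thermal power an LED dumps into the heat sink (Cree's Equation 1).

    v_f: forward voltage in volts, i_f: drive current in amps.
    heat_fraction: share of input power assumed to become heat;
    Cree's conservative design figure is 0.75.
    """
    return heat_fraction * v_f * i_f

# Illustrative numbers (assumed, not from a datasheet): ~3.1 V at 2.0 A
print(thermal_power_w(3.1, 2.0))  # 4.65 W of heat for the sink to handle
```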

Hope this is helpful!
PPtk

Great post, thank you. :)

This is a complicated question that I have also pondered, but without any numbers to work with. What complicates it further is that human perception varies across different wavelengths, and white light is not a colour but a combination of colours, which Cree approximates with different wavelengths combined into one chip; not very well with the higher-binned LEDs, which I believe is what causes the low CRI.
In time they will release high-CRI lights with high efficiency; they have already made some such lights in the lab, and they should keep improving. At this rate, in 5-10 years I would assume we should be able to buy chips with a CRI of 90 or greater at 200 lm/W, at 10 watts or more.

I’m still amazed that these tiny MMxMM squares are able to survive some of the current/voltage that a few of the manufacturers - no less than some of the members here - introduce into the emitters.

Thanks again!

So it appears to be like I imagined: you can work out the efficacy of a ‘white’ light source by comparing its lm/W value to the 251 lm/W theoretical maximum?

So a Cree LED approaching 125 lm/W, if run at its most efficient drive current, should approach 50% conversion efficiency.

There would be a slight variance due to an LED not being an ‘ideal’ white 5800 K light source, with a bias for slightly green or slightly red tinted LEDs, but in principle you should be able to work it out from the lm/W you are getting?
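As a rough illustration of that reasoning, here is a minimal sketch. It assumes the 251 lm/W ceiling quoted above actually applies to the LED’s spectrum, which the replies below question:

```python
IDEAL_WHITE_LM_PER_W = 251.0  # theoretical maximum quoted above (assumed to apply)

def conversion_efficiency(measured_lm_per_w):
    """Rough wall-plug efficiency estimate from measured luminous efficacy."""
    return measured_lm_per_w / IDEAL_WHITE_LM_PER_W

print(conversion_efficiency(125.0))  # ~0.50, i.e. roughly 50%
```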

Not necessarily; the white Cree LEDs are not a perfect white, they are an adequate white.
Does anyone have a perfect-white graph to mirror the wavelength distribution graph in the XM-L and XM-L2 datasheets?

Are you suggesting that a white light source composed of peaks at the peak sensitivity wavelengths for Short (Blue), Medium (Green), and Long (Red) cones could have a theoretical maximum efficacy of over 251 lm/W? That’s assuming that an ‘ideal’ white light source is necessarily defined as a full-spectrum one.

I had considered that, but I don’t really know whether it’s true or not…

Other than that, I would think small variations in average spectral wavelength wouldn’t have much of an effect.

Pilot, I’ve just seen this post in the other thread. Thanks for that, it’s an interesting different approach to working it out.

I’m guessing that the difference between the calculation based on the max theoretical efficacy of an ‘ideal’ white light source and what LEDs actually do must be down to the ‘peakiness’ of white LEDs’ spectral distribution. Meaning that they could have a theoretical max efficacy (not taking into account inefficiencies inherent in the construction of the LEDs themselves) above that of a full-spectrum white light source.

That’s what the sun puts out. Is human vision equally weighted?

No, not at all.
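Human sensitivity peaks around 555 nm (green) and falls off steeply towards deep blue and red. To make that concrete, here is a sketch using a well-known Gaussian approximation of the CIE photopic sensitivity curve V(λ); for real work you would use the tabulated CIE data:

```python
import math

def photopic_V(wavelength_nm):
    """Gaussian approximation of the CIE photopic curve V(lambda):
    1.019 * exp(-0.5 * ((lambda - 559 nm) / 41.9 nm)**2).
    Crude: it slightly overshoots the true 683 lm/W peak."""
    return 1.019 * math.exp(-0.5 * ((wavelength_nm - 559.0) / 41.9) ** 2)

# One radiant watt at each wavelength is perceived very differently:
for wl in (450, 555, 650):  # blue, green, red
    print(f"{wl} nm -> ~{683 * photopic_V(wl):.0f} lm per radiant watt")
```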

From Bridgelux’s AN10 - Thermal Management of LED Arrays (highly recommended reading):

When voltage is applied across the junction of an LED, current flows through the junction generating light. It is a common misconception that LEDs do not generate heat. While essentially no heat is generated in the light beam (unlike conventional light sources), heat is generated at the junction of the LED Array and must be effectively managed. As LEDs are not 100% efficient at converting input power to light, some of the energy is converted into heat and must be transferred to the ambient. The amount of heat generated from the LED Array that must be transferred to the ambient may be conservatively estimated to be 85% of the power that is applied to the LED Array and is calculated by multiplying the forward voltage (Vf) times the current (If) times 0.85. This is described in Equation 1.

P_d = V_f × I_f × 0.85

Equation 1: Thermal power to be dissipated
where:
P_d is the thermal power to be dissipated
V_f is the forward voltage of the device
I_f is the current flowing through the device

The power calculation should be made for maximum dissipated power, which is computed using the Maximum Vf at the drive current desired. The Maximum Vf can be obtained from the electrical characteristics table in the Bridgelux LED Array Product Data Sheet for the array being used. Product data sheets are available at www.bridgelux.com.
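That is the same arithmetic as the Cree note earlier in the thread, just with a more conservative heat fraction. A short sketch with the same illustrative (non-datasheet) numbers:

```python
v_f, i_f = 3.1, 2.0                 # illustrative values, not from any datasheet
p_d = v_f * i_f * 0.85              # Bridgelux Equation 1
print(f"{p_d:.2f} W to be dissipated")  # 5.27 W, vs 4.65 W with Cree's 0.75
```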

Heat generated by additional sources, such as a power supply located near the LED Array, must also be managed. In order to reduce the size and cost of the thermal management solution, and to minimize the amount of additional heat added to the system, power supplies and other heat generating components should not be located in close proximity to the LED Array.

They also say that when designing an LED system, it’s best to assume all the power in gets converted to heat. You can’t go wrong that way…

How does the efficiency of LEDs compare to CFL and incandescent?

Compared to incandescent? MANY times better. While an LED might convert 15 to 50% of its energy into light, an incandescent converts 1% or less into light (and the rest into heat).

Compared to CFL? Depends on the LED. Could be a little better, could be a little worse. They’re in the same ballpark, though.
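To put rough numbers on that ballpark, a quick sketch with assumed round-number efficacies (illustrative only, not measurements):

```python
# Assumed round-number efficacies (lm/W at the wall), for illustration only:
sources = {"incandescent": 15, "CFL": 60, "LED": 100}
target_lm = 800  # roughly a "60 W equivalent" bulb

for name, efficacy in sources.items():
    watts_in = target_lm / efficacy
    print(f"{name}: ~{watts_in:.0f} W drawn for {target_lm} lm")
# -> incandescent ~53 W, CFL ~13 W, LED ~8 W; nearly all of the
#    extra input power comes back out as heat.
```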

PPtk

Yeah, the theoretical maximum must be >251 lm/W, considering that figure has already been broken.

http://www.cree.com/news-and-events/cree-news/press-releases/2012/april/120412-254-lumen-per-watt

In other posts, many people have correctly pointed out that one of the bothersome things about CFLs is that they need to “warm up”: when they are first turned on, they need a little while to come to full brightness. While to some this may seem to be a disadvantage, it is actually an advantage! To look at it another way, a CFL actually puts out MORE lumens as it warms up.
Now, a major problem with the LED. In other posts, many people have correctly pointed out that one of the advantages of LEDs is that they are “instant on”: when they are first turned on, they are already at full brightness. However, as we already know, LEDs put out less light as junction temperature goes up, for some LEDs by up to 30%. To look at it another way, an LED actually puts out LESS lumens as it warms up. LEDs need to be kept cool! This is the BIG problem.
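A crude way to picture that droop, as a minimal sketch: this assumes a simple linear derating with a made-up coefficient, whereas real datasheets publish the actual relative-flux-vs-temperature curve.

```python
def relative_flux(t_junction_c, droop_per_c=0.003, t_ref_c=25.0):
    """Relative light output vs junction temperature, assuming a simple
    linear derating. droop_per_c (~0.3%/degC) is a made-up illustrative
    coefficient; check the datasheet curve for a real emitter."""
    return max(0.0, 1.0 - droop_per_c * (t_junction_c - t_ref_c))

print(relative_flux(125.0))  # ~0.70: the ~30% hot-vs-cold drop mentioned above
```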

Of course, LEDs for in-home and commercial use are mounted to a heat sink, and if you have ever looked at some of the brighter ones, they can be HUGE. These LEDs are intended, in some cases, to run 24/7, perhaps in a shroud or enclosed fixture. I have seen LED lights that have several POUNDS of heat sinking for one LED chip, perhaps an XM-L. In other words, the heat sinking is hundreds of times larger than the actual light source, and WAY more expensive.
Does anybody else think it absolutely obscene that such a small piece of high technology (the LED chip) needs to be mated to such a big brute of a low-technology structure (the heat sink)? What a shame! Why is it that an LED needs so much heat sinking while CFLs and incandescents don’t?

It’s because CFLs and incandescents can run at higher temperatures and can dissipate more heat through radiation. Because LEDs can’t stand the heat, they have to run cooler, and so they are not able to dissipate wasted heat through thermal radiation as effectively as other light sources. Until LEDs are able to run at elevated temperatures, this will be a big problem for high-output LEDs meant to run 24/7. The solution could be greater efficiency, but I think we are already near the theoretical limit. It might take new materials science to achieve this (higher operating temperatures).

The amount of heat that can be dissipated through black-body radiation is proportional to the surface area and to the 4th power of the absolute temperature (in kelvins, so the gain is not quite as dramatic as it sounds). So being able to run at higher temperatures would go a long way towards increasing the ability to dissipate the wasted heat. Of course, this would not be appropriate for flashlights; they are running hot enough already.
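To put rough numbers on that, here is a minimal sketch of the Stefan-Boltzmann law; the 0.01 m² area and 0.9 emissivity are assumptions, and it computes net radiation to 25 °C surroundings:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_w(area_m2, temp_c, ambient_c=25.0, emissivity=0.9):
    """Net radiative heat transfer from a surface to its surroundings."""
    t, ta = temp_c + 273.15, ambient_c + 273.15
    return emissivity * SIGMA * area_m2 * (t ** 4 - ta ** 4)

# Same assumed 0.01 m^2 of surface, two surface temperatures:
print(radiated_w(0.01, 60))   # ~2.3 W from an LED-friendly 60 degC sink
print(radiated_w(0.01, 250))  # ~34 W from a 250 degC lamp-hot surface
```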

I have tested dozens of LED bulbs. Yes, the light output decreases as the bulb warms up, usually by less than 10%. Their output is spec’d at their operating temperature, so you get more light than you paid for when you first turn them on. There is no way you can notice the drop in output as they warm up. The light increase as CFLs warm up, by contrast, is quite noticeable; I have measured it at over 50% during the first 20 minutes.

We will probably be stuck with the large excess of heat generated by power LEDs (and their need for expensive heat-sinking solutions) until another technology ultimately comes along to replace them. Something other than LED. Since the price of many CFLs is often heavily subsidized (in the US), they are easily found for less than $1 per bulb. When compared to a $30+ 1000-lumen 10-13 watt LED bulb, one would have to be crazy to even consider LED. Even with the potential efficiency advantage of LED at those lumen levels (though not always), it would take years (with average use) to even come close to breaking even on the difference in power consumption. All of this assuming that they don’t fail far before living that long… and that’s another pathetic story unto its own.
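For the sake of argument, here is that break-even arithmetic as a sketch; the prices, wattages, daily use, and electricity cost below are all assumed round numbers, not data:

```python
# All numbers below are assumptions for illustration, not measured data.
led_price, cfl_price = 30.00, 1.00   # $ per ~1000 lm bulb
led_w, cfl_w = 12, 23                # watts drawn for ~1000 lm
hours_per_day = 3
dollars_per_kwh = 0.12

kwh_saved_per_year = (cfl_w - led_w) / 1000 * hours_per_day * 365
years_to_break_even = (led_price - cfl_price) / (kwh_saved_per_year * dollars_per_kwh)
print(f"~{years_to_break_even:.0f} years to break even")  # ~20 at these assumptions
```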

With the newest high-output high-bay and industrial LED lighting solutions, there will be some accumulated cost savings over HID in power consumption. The other, and potentially larger, savings might be realized in their higher reliability (maintenance labor and replacement parts are very expensive and inconvenient).

The efficiency curve for LEDs is flattening. Let’s hope something new and exciting comes along to the consumer market soon. Even better, let’s hope that heat sinking is not part of the future equation for high-efficiency lighting.

>Since the price of many CFLs is often heavily subsidized (in the US), they are easily found for less than $1 per bulb. When compared to a $30+ 1000-lumen 10-13 watt LED bulb, one would have to be crazy to even consider LED.

Unless, of course, you actually want a spectrum that doesn’t give you skin cancer and doesn’t wash colors out.

People whinged and whined about CFLs when they first came out and about all their disadvantages compared to incandescent, so having the same happen with LED lighting now that CFLs are an established technology isn’t really that surprising.

Personally, I can’t wait for LED light bulbs to come down in price to what I can afford: no more warm-up times (CFLs hardly ever warm up outside in winter), no more dodgy CRI, and fewer bulbs to replace. Their current price as new tech means the financial payoff doesn’t make much sense on the home scale for most people, but there are plenty of people for whom the other advantages more than make up for that. At an industrial scale, the payoff tends to be very quick, in the 2-3 year range, once you factor in maintenance as well as electricity costs over halogen/sodium/arc.