LED Heat Generation

Are you suggesting that a white light source composed of peaks at the peak sensitivity wavelengths for Short (Blue), Medium (Green), and Long (Red) cone responses could have a theoretical maximum efficacy of over 251 lm/W? That’s assuming that an ‘ideal’ white light source is necessarily defined as a full-spectrum one.

I had considered that, but I don’t really know whether it’s true or not…

Other than that I would think small variations in average spectral wavelength wouldn’t have much of an effect.

Pilot, I’ve just seen this post in the other thread. Thanks for that, it’s an interesting different approach to working it out.

I’m guessing that the differences in calculation based upon the max theoretical efficacy for an ‘ideal’ white light source and LEDs must be to do with the ‘peakiness’ of white LEDs spectral distribution. Meaning that they could have a theoretical (not taking into account inefficiencies inherent in the construction process of the LEDs themselves) max efficacy above that of a full-spectrum white light source.
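The "peakiness" argument can be made concrete with the standard luminous-efficacy arithmetic: weight each spectral peak by the CIE photopic sensitivity V(λ) and scale by 683 lm/W. A rough sketch, with illustrative wavelengths and an equal-power mix that is *not* an attempt to reproduce the exact 251 lm/W figure:

```python
# Luminous efficacy of a source built from narrow spectral peaks:
#   efficacy (lm per radiant W) = 683 * sum(P_i * V(lambda_i)) / sum(P_i)
# V(lambda) is the CIE 1931 photopic luminosity function, V(555 nm) = 1.0.
# The three wavelengths and the equal-power mix below are illustrative
# assumptions, not the weighting behind the 251 lm/W figure in the thread.

# Approximate CIE photopic V(lambda) at three peak wavelengths
V = {450: 0.038, 555: 1.000, 610: 0.503}

def efficacy(mix):
    """mix: dict mapping wavelength (nm) -> radiant power (W)."""
    total_radiant = sum(mix.values())
    luminous_flux = 683.0 * sum(p * V[wl] for wl, p in mix.items())
    return luminous_flux / total_radiant  # lm per radiant watt

# A single 555 nm line is the theoretical ceiling: 683 lm/W
print(efficacy({555: 1.0}))  # 683.0
# An equal-radiant-power three-peak mix lands well below that ceiling
print(round(efficacy({450: 1.0, 555: 1.0, 610: 1.0}), 1))
```

Note that an equal-radiant-power mix at those three wavelengths is not a balanced white; a real white point would use different ratios and land at a different number, which is exactly why "peakiness" moves the theoretical maximum around.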

That’s what the sun puts out. Is human vision equally weighted?

No, not at all.

From Bridgelux’s AN10 - Thermal Management of LED Arrays (highly recommended reading):

When voltage is applied across the junction of an LED, current flows through the junction generating light. It is a common misconception that LEDs do not generate heat. While essentially no heat is generated in the light beam (unlike conventional light sources), heat is generated at the junction of the LED Array and must be effectively managed. As LEDs are not 100% efficient at converting input power to light, some of the energy is converted into heat and must be transferred to the ambient. The amount of heat generated from the LED Array that must be transferred to the ambient may be conservatively estimated to be 85% of the power that is applied to the LED Array and is calculated by multiplying the forward voltage (Vf) times the current (If) times 0.85. This is described in Equation 1.

Pd = Vf × If × 0.85 (Equation 1: Thermal power to be dissipated)

Where:
Pd is the thermal power to be dissipated
Vf is the forward voltage of the device
If is the current flowing through the device

The power calculation should be made for maximum dissipated power, which is computed using the Maximum Vf at the drive current desired. The Maximum Vf can be obtained from the electrical characteristics table in the Bridgelux LED Array Product Data Sheet for the array being used. Product data sheets are available at www.bridgelux.com.
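Equation 1 is a one-liner in code. A minimal sketch, where the example Vf and If values are made up for illustration rather than taken from any Bridgelux data sheet:

```python
# Bridgelux AN10 Equation 1: thermal power to be dissipated, assuming
# (conservatively) that 85% of the electrical input becomes heat.
def dissipated_power(vf, i_f, heat_fraction=0.85):
    """vf: forward voltage (V), i_f: forward current (A) -> watts of heat."""
    return vf * i_f * heat_fraction

# Example: a hypothetical array running at Vf = 35 V, If = 1.05 A
print(round(dissipated_power(35.0, 1.05), 2))  # 31.24
```

Per the note above, using the maximum data-sheet Vf at your drive current (or setting heat_fraction=1.0, i.e. "assume it's all heat") gives the conservative number to size the heat sink against.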

Heat generated by additional sources, such as a power supply located near the LED Array, must also be managed. In order to reduce the size and cost of the thermal management solution, and to minimize the amount of additional heat added to the system, power supplies and other heat generating components should not be located in close proximity to the LED Array.

They also say that when designing an LED system, it’s best to assume all the power in gets converted to heat. You can’t go wrong that way…

How does the efficiency of LEDs compare to CFL and incandescent?

Compared to incandescent? MANY times better. While an LED might convert 15 to 50% of its energy into light, an incandescent converts 1% or less into light (and the rest into heat).

Compared to CFL? Depends on the LED. Could be a little better, could be a little worse. They’re in the same ballpark though.

PPtk

Yeah, the theoretical maximum must be over 251 lm/W, considering that figure has already been broken.

http://www.cree.com/news-and-events/cree-news/press-releases/2012/april/120412-254-lumen-per-watt

In other posts, many people have correctly pointed out that one of the bothersome things about CFLs is that they need to “warm up”: when they are first turned on they need a little while to come to full brightness. While to some this may seem to be a disadvantage, it is actually an advantage! To look at it another way, a CFL actually puts out MORE lumens as it warms up.
Now a major problem with the LED. In other posts, many people have correctly pointed out that one of the advantages of LEDs is that they are “instant on”: when they are first turned on they are already at full brightness. However, as we already know, LEDs put out less light as junction temperature goes up, by up to 30% for some LEDs. To look at it another way, an LED actually puts out LESS lumens as it warms up. LEDs need to be kept cool! This is the BIG problem.
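The warm-up droop can be sketched with a simple linear derating model. The 30% figure comes from the post above; the 25 °C reference point and 125 °C hot end are assumptions for illustration, not data-sheet values for any specific part:

```python
# Rough linear derating model: LED lumen output vs junction temperature.
# droop_at_hot = 0.30 matches the worst case quoted in the thread; the
# reference and hot-end temperatures are assumed, not from a datasheet.
def relative_output(tj_c, t_ref=25.0, t_hot=125.0, droop_at_hot=0.30):
    """Fraction of the cold-start (t_ref) lumen output at junction temp tj_c."""
    slope = droop_at_hot / (t_hot - t_ref)      # fractional loss per deg C
    return max(0.0, 1.0 - slope * (tj_c - t_ref))

print(relative_output(25.0))                 # 1.0 (full cold-start output)
print(round(relative_output(125.0), 2))      # 0.7 (30% down at the hot end)
```

Real parts publish this as a relative-flux-vs-Tj curve, which is not perfectly linear, but the direction is the same: hotter junction, fewer lumens.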

Of course, LEDs for home and commercial use are mounted to a heat sink, and if you have ever looked at some of the brighter ones, they can be HUGE. These LEDs are intended, in some cases, to run 24/7, perhaps in a shroud or enclosed fixture. I have seen LED lights that have several POUNDS of heat sinking for a single LED chip, perhaps an XM-L. In other words, the heat sinking is hundreds of times larger than the actual light source, and WAY more expensive.
Does anybody else think it absolutely obscene that such a small piece of high technology (the LED chip) needs to be mated to such a big brute of a low-technology structure (the heat sink)? What a shame! Why is it that an LED needs so much heat sinking while CFLs and incandescents don’t?

It’s because CFLs and incandescents can run at higher temperatures and can dissipate more heat through radiation. Because LEDs can’t stand the heat, they have to run cooler and are not able to dissipate waste heat through thermal radiation as effectively as other light sources. Until LEDs are able to run at elevated temperatures, this will be a big problem for high-output LEDs meant to run 24/7. The solution could be greater efficiency, but I think we are already near the theoretical limit. It might take new materials science to achieve higher operating temperatures.

The amount of heat that can be dissipated through black-body radiation is proportional to the surface area and to the 4th power of the absolute temperature (in kelvin, so the effect is not quite as dramatic as it sounds). So being able to run at higher temperatures would go a long way toward increasing the ability to dissipate the waste heat. Of course this would not be appropriate for flashlights; they are running hot enough already.
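The T⁴ scaling is the Stefan–Boltzmann law; net radiated power also subtracts the ambient term. A quick sketch, where the surface area, emissivity, and temperatures are illustrative numbers only:

```python
# Net radiated power per Stefan-Boltzmann: P = eps * sigma * A * (T^4 - Ta^4).
# Area, emissivity, and temperatures below are illustrative assumptions.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiated_watts(area_m2, t_surface_k, t_ambient_k, emissivity=0.9):
    """Net power radiated to ambient by a surface at t_surface_k."""
    return emissivity * SIGMA * area_m2 * (t_surface_k**4 - t_ambient_k**4)

# Same 10 cm^2 surface, 300 K ambient: compare a 350 K vs a 500 K surface
cool = radiated_watts(0.001, 350.0, 300.0)
hot = radiated_watts(0.001, 500.0, 300.0)
print(round(cool, 3), round(hot, 3), round(hot / cool, 1))
```

This is why being allowed to run hot matters so much: the same small surface sheds several times more watts by radiation at the higher temperature, which LEDs can't take advantage of.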

I have tested dozens of LED bulbs. Yes, the light output decreases as the bulb warms up, usually by less than 10%. Their output is spec’d at their operating temperature, so you get more light than you paid for when you first turn them on. There is no way you can notice the drop in output as they warm up. The light increase as CFLs warm up, by contrast, is quite noticeable; I have measured it at over 50% during the first 20 minutes.

We will probably be stuck with the large excess of heat generated by power LEDs (and their need for expensive heat-sinking solutions) until another technology ultimately comes along to replace them. Something other than LED. Since the price of many CFLs is often heavily subsidized (in the US), they are easily found for less than $1 per bulb. When compared to a $30+ 1000-lumen 10-13 watt LED bulb, one would have to be crazy to even consider LED. Even with the potential efficiency advantage (but not always) of LED at those lumen levels, it would take years (with average use) to even come close to breaking even on the difference in power consumption. All of this assuming that they don’t fail long before living that long… and that’s another pathetic story unto its own.
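The "it would take years" claim is easy to sanity-check. A rough payback sketch; the wattage saved, hours per day, and electricity rate are assumptions chosen to roughly match the thread's numbers, not measured figures:

```python
# Rough payback calculation for the $30 LED bulb vs $1 CFL comparison.
# watts_saved, hours_per_day, and usd_per_kwh are illustrative assumptions.
def payback_years(price_diff, watts_saved, hours_per_day, usd_per_kwh=0.12):
    """Years to recoup price_diff (USD) via reduced electricity use."""
    daily_kwh_saved = watts_saved / 1000.0 * hours_per_day
    yearly_savings = daily_kwh_saved * 365 * usd_per_kwh
    return price_diff / yearly_savings

# $29 price gap, ~10 W saved over a comparable CFL, 3 h/day average use
print(round(payback_years(29.0, 10.0, 3.0), 1))
```

With those assumptions the payback comes out to roughly two decades, which is the point being made: at a $29 price gap over a CFL, average household use never gets close to break-even within the bulb's life.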

With the newest high output high-bay and industrial LED lighting solutions, there will be some accumulated cost savings over HID in power consumption. The other, and potentially larger savings, might be realized in their higher reliability (maintenance labor and replacement parts are very expensive and inconvenient).

The efficiency curve with LED is flattening. Let’s hope something new and exciting comes along to the consumer market soon. Even better, let’s hope that heat sinking is not part of the future equation for high-efficiency lighting.

>Since the price of many CFL’s is often heavily subsidized (in the US), they are easily found for less than $1 per bulb. When compared to a $30+ 1000 lumen 10-13 watt LED bulb, one would have to be crazy to even consider LED.

Unless, of course, you actually want a spectrum that doesn’t give you skin cancer and doesn’t wash colors out.

People whinged and whined about CFLs when they first came out, and about all their disadvantages compared to incandescent, so having the same happen with LED lighting now that CFLs are an established technology isn’t really that surprising.

Personally I can’t wait for LED lightbulbs to come down in price to what I can afford - no more warm up times (if they ever warm up outside in winter), dodgy CRI and fewer bulbs to replace. Their current price as new tech means the financial pay off doesn’t make much sense on the home scale for most people, but there are plenty of people for whom the other advantages more than make up for that. At an industrial scale, the pay off tends to be very quick, in the 2-3 year range, once you factor in maintenance as well as electricity costs over halogen/sodium/arc.

Is this where the 251 lm/W comes from?
There is also a peak at 480 nm (blue), which we are not consciously aware of, that sets our circadian rhythm and is not reflected in that graph.

Other thoughts about this thread
I would assume CFLs are rated at full output, whatever that eventually comes out to.
LED chips from Cree were rated at 25 °C; now many are rated at the more ‘realistic’ 85 °C, but I assume you are correct, pyro, that finished bulbs are rated at steady-state output.
The 0.85 heat figure sounds like a rule of thumb, just like “assume all energy is converted to heat” (a great rule to cover any situation the LED will face). As output continues to increase at higher currents, the rule won’t be applicable anymore (maybe a few years from now).

LEDs will always need heatsinking; being IC components, they need to be kept from overheating and burning themselves out. But as efficiency gets better, less heatsinking will be required, and hopefully eventually only a little bit will be necessary per chip.

CFLs work at about 60-75 lm/W; any LED below this is not worth it unless it’s in a form factor that a CFL can’t duplicate (e.g. GU10).



I wish there was more information and more disclaimers on these two technologies on their packages and manufacturer’s websites.

CFLs burn out way faster when used in an on/off situation; they only have a fixed number of starts, determined by how they were manufactured. This information is not available for any bulb I have looked into. I often use incandescents in bathrooms and other places where the bulb will be used only a few minutes at a time, as CFLs there will only last a few months. LED replacements would be great if they were not as expensive as they are.

CFLs will last far longer than their rated hours if they are not switched on and off frequently. Any situation where a light is on 24 hours a day will make a CFL last double or more its rated life.

CFLs have a bad reputation because the market was flooded by super cheaply made junk, usually Chinese-manufactured.

Neither CFLs nor LEDs should be put in a closed fixture that doesn’t allow heat to escape. LEDs especially should be installed in a manner that allows maximum air flow for cooling: never in a closed fixture, and ideally in one that has air flowing around it in all directions.

CFLs generally shouldn’t be installed outdoors unless in a closed fixture where the temperature won’t get too cold.

It seems low CRI in LEDs is a trade-off to improve lumens per watt, hence if we used XM-Ls to make bulbs they wouldn’t do well in the marketplace. As efficiency improves, I hope to see single-LED bulbs that put out over 1000 lumens per fixture, under $10 each, available at Home Depot.

The Sylvania 10W/500 lumen/PAR16/85 CRI bulbs that I have use 4 XM-Ls. Their original list price was $229 each, with Amazon selling them for around $105. You can get them now for under $40. There is still nothing else on the market that comes close to them in PAR16.

No, they’re not IC components. They’re diodes.

PPtk

That’s right they are,
But all the waste heat is generated at the junction of the diode, a very tiny area. This is a problem for high-power LEDs; other lighting technologies do not have to deal with waste heat in such a concentrated area. An LED is a pinpoint light source AND a pinpoint heat source, while other lighting technologies inherently produce their light, and their waste heat, over a MUCH larger area.

Except for focusing applications (flashlights, spotlights, or directional lighting), LEDs have another inherent disadvantage: they produce highly directional light, which produces harsh shadows. I REALLY don’t see a future for a few high-power LEDs in a fixture to emulate the kind of light we want in our homes or offices. That approach creates too much light and too much heat in too small an area to be dealt with effectively.

The future I see for casual LED lighting is arrays of relatively low-power LEDs spread out over a larger area. This is actually the older tech that is now available and being used in much of the in-home and office lighting. Perhaps some day, LED lighting will be able to be “printed” into large sheets, so that the light is evenly spread out over the entire area. Even now it is possible to buy cheap “strips” of LED lights from FastTech for a very reasonable price. Here is the Link
About 1200 lumens for $10. Notice that the strip does not need ANY heat sinking. That’s because the light and heat are spread out over the 16-foot length of the strip: 300 LEDs. Our own texaspyro used this type of light to great effect here: Replacing Lumiline light bulbs with LEDs

To simplify: for flashlights, spotlights, or directional lighting, high-power LEDs with massive heat sinking or active cooling. For the home or office, or anywhere directional lighting is undesirable, large arrays or strips of many low-power LEDs. I really don’t see any way around it; the physics will not allow it.

BTW, even older lighting technologies (incandescent, fluorescent, etc.) that already produce their light over a larger area still rely on diffusers to try to emulate an even larger source of light. Look around your home or office: incandescent bulbs are frosted (soft white), ceiling fixtures have frosted or white globes, and even 4-foot fluorescent fixtures have diffusers or a reflector to spread the light out further.

Go outside at noon on a clear, sunny day and you will see a sharp shadow of yourself and other objects. On a clear day, the sun acts as a point source of light in the sky (because of its great distance) and casts sharp shadows. Then go out on an overcast day and you will not see any shadow. The entire hemisphere overhead is the diffused light from the sun, and appears to us as an “infinite plane” light source. An infinite plane light source casts NO shadows, and its brightness does not vary with distance. THAT is the best source of light for everyday living.
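The point-source vs infinite-plane contrast boils down to simple geometry. A sketch in arbitrary units, assuming an ideal Lambertian (uniformly glowing) plane, which gives illuminance π·L at any distance:

```python
import math

# Illuminance vs distance for the two idealized sources described above.
# A point source falls off as 1/r^2; an ideal infinite Lambertian plane
# gives the same illuminance at any distance. Numbers are illustrative.

def point_source(intensity, r):
    """Illuminance at distance r from a point source (inverse-square law)."""
    return intensity / r**2

def infinite_plane(luminance):
    """Illuminance under an infinite Lambertian plane: pi * L, any distance."""
    return math.pi * luminance

for r in (1.0, 2.0, 4.0):
    print(r, point_source(100.0, r), infinite_plane(100.0))
```

Doubling the distance quarters the point-source illuminance while the plane-source value never changes, which is why the overcast sky casts no shadows and doesn't dim as you move around under it.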

Very interesting. I found a datasheet on it; I should have mentioned T6 or U2 chips because of the low CRI. At 50 lumens per watt it’s less than CFL output, but in those small form factors CFLs won’t quite fit, of course.

True, but some manufacturers (Bridgelux, Sharp…) finally realized that and are now producing LEDs with double domes. Personally, I could not believe it when I fired up my first Bridgelux SM4 at 4 W and could look directly at the emitter without the typical consequences. 8)

LED strips? Sure, but with some of these: http://www.sharpleds.com/doubledome.html and with proper “driving”, not using resistors…

An often-quoted advantage of LEDs is that the LED’s light doesn’t produce heat and is cool to the touch. LEDs are cool to the touch because they generally don’t produce heat in the form of infrared (IR) radiation (unless, of course, they are IR LEDs).
Only 10% of the energy used by an incandescent bulb is converted to light; the other 90% is lost as heat.