High power LEDs info

Hi guys, don't know where else to ask this but thought I would pick your brains.. lol

Here goes

If LEDs are so efficient, why do high power LEDs get so HOT? And what is creating the heat? I thought that the whole idea of an LED was that it used very little energy and gave off NO heat..

Answers on a postcard

Charles

I'll offer a couple of thoughts. LEDs are more efficient than incandescent lights, but there is always going to be some heat loss. Also, LEDs are at peak efficiency at power levels lower than what is needed in a flashlight, so we really overdrive them, and the cost of that is less efficiency and a lot more heat.

I've seen a few articles saying how great LED lights for household use are going to be, but I don't know how much more efficient they will be than compact fluorescents, which use a quarter of the energy of regular bulbs.

Well, I did some Googling and found an article that has information about efficiency and lifetime. Here's a summary:

                      Efficiency (lm/W)   Lifetime (hours)
Incandescent                 17                 750
Compact Fluorescent          63               8,000
LED                          77              50,000

http://sewelldirect.com/articles/led-vs-incandescent-light-bulbs.aspx

The most efficient LEDs available today generate 120-150 lumens per watt. The theoretical maximum is 400 lumens/watt - so even the best of them are dissipating more than half the input energy as heat. And then there are their driver circuits. Many of the cheaper ones are no more than 75% efficient so that's another quarter of the input energy lost as heat. I'd guess that in low powered lights most of the heat is coming from the driver, not the LED.
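If it helps to see those two loss terms as numbers, here's a rough Python sketch. The 120-150 lm/W, 400 lm/W and 75% driver figures are the ones from the paragraph above; the 1W input is just an illustration, not a measurement.

# Rough heat budget for a low-powered LED light, using the figures above.
# All numbers are illustrative assumptions, not measurements.
input_power_w = 1.0           # power drawn from the cell (illustrative)
driver_efficiency = 0.75      # one of the cheaper drivers mentioned above
led_efficacy_lm_per_w = 135   # mid-range of the 120-150 lm/W quoted
max_efficacy_lm_per_w = 400   # theoretical maximum used above

driver_heat_w = input_power_w * (1 - driver_efficiency)
led_power_w = input_power_w * driver_efficiency
led_heat_w = led_power_w * (1 - led_efficacy_lm_per_w / max_efficacy_lm_per_w)

print(f"Driver heat: {driver_heat_w:.2f} W")   # 0.25 W
print(f"LED heat:    {led_heat_w:.2f} W")      # ~0.50 W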

All of the next bit is cribbed from here

http://www.cree.com/products/pdf/XLampXP-G.pdf

For a concrete example, take an XP-G R5 driven at its maximum of 1.5A. It is much less efficient there than at 350mA, where the published measurements are usually made (350mA works out to roughly 1 watt of input power). It also gets less efficient as temperature rises. The junction temperature - the temperature of the bit that actually makes the light, which is buried under the (insulating) yellow phosphor - is usually 25 Centigrade for efficiency measurements. Output drops to around 70% of that figure at a junction temperature of 150 Centigrade, and it probably drops to zero (i.e., the LED dies) by 200.

In most lights the best we can hope for is a junction temperature of less than 110C, which means around 80% of the rated efficiency. The forward voltage - basically the voltage at which it lights up - also rises with current. At 1.5A it needs just over 3.5V, so we need 3.5V minimum at 1.5A. That gives us a power input of (power = voltage x current) 3.5 x 1.5 = 5.25W for the LED. Now add in driver losses - we'll be exceptionally generous and assume the driver is 85% efficient. So to get 5.25W at the LED we need to put in 100/85 x 5.25W = 6.18W. Even if the LED were 100% efficient, we've already got the heat from the driver - 6.18 - 5.25 = 0.93W to dissipate.
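The same arithmetic as a little Python sketch, for anyone who wants to play with the numbers. The 3.5V, 1.5A and 85% values are the assumptions from the paragraph above.

# Power into the LED and the heat produced by the driver, as worked out above.
forward_voltage_v = 3.5    # forward voltage of the XP-G at 1.5 A
current_a = 1.5            # drive current
driver_efficiency = 0.85   # the (generous) driver efficiency assumed above

led_power_w = forward_voltage_v * current_a       # 3.5 x 1.5 = 5.25 W into the LED
input_power_w = led_power_w / driver_efficiency   # 6.18 W drawn from the cell
driver_heat_w = input_power_w - led_power_w       # 0.93 W of heat from the driver

print(f"LED power:   {led_power_w:.2f} W")
print(f"Input power: {input_power_w:.2f} W")
print(f"Driver heat: {driver_heat_w:.2f} W")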

We'll assume a solid aluminium light - which is obviously an approximation.

Now my Trustfire R5-A3 is a typical aluminium AA light. It weighs 55g. The specific heat capacity of aluminium is 0.91 kJ (910 J) per kilogram per kelvin, and a change of 1 kelvin is the same as a change of 1 degree Centigrade.

So my 55 gram light (I'm ignoring the cell as it is not in good thermal contact with the rest of the light which is a very good thing with lithium cells) begins to warm up.

To heat up my 0.055 kilogram light by 1 degree Centigrade requires 0.055kg x 910J per kg per degree = 50.05 joules of energy. One watt is one joule per second, so I have 0.93 joules per second going into the body from the driver alone. OK, in reality there are losses to the environment - conduction (your hand is a heatsink) and convection rise roughly in proportion to the temperature difference, and radiation (minimal here) goes as the fourth power of the absolute temperature - but we are in physics experiment land here and I am assuming no losses at all. So the losses from the driver alone are going to heat the light up by about a degree per minute (0.93 x 60 / 50.05 = about 1.1 degrees). No big deal there.
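Here's that temperature-rise calculation as a sketch: 55g of aluminium, 910 J per kg per degree, and no losses to the surroundings, as assumed above.

# How fast the body warms up if none of the heat escapes.
mass_kg = 0.055                  # 55 g aluminium body (cell ignored, as above)
specific_heat_j_per_kg_c = 910   # specific heat capacity of aluminium
heat_w = 0.93                    # heat from the driver alone, in joules per second

joules_per_degree = mass_kg * specific_heat_j_per_kg_c   # 50.05 J per degree C
rise_c_per_min = heat_w * 60 / joules_per_degree         # ~1.1 C per minute

print(f"Energy per degree: {joules_per_degree:.2f} J")
print(f"Temperature rise:  {rise_c_per_min:.1f} C per minute")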

But we turn to the LED. We have a bit over 5.25 watts going into it. Out of it we are getting about 3.25x as much light as we would at 350mA. However, to get it, we are putting in 4.3x as much energy, so it is 3.25/4.3 = 76% as efficient at best. So a quarter of that 5.25W is now appearing as heat, which is another 1.26W. We are now losing more than two watts as heat: 1.26W from the drop in efficiency of the LED at the higher current and 0.93W from the driver. Now we are putting 2.19 joules per second into the light body, so it is rising in temperature by roughly 2.5 Centigrade per minute. This paragraph is a bit of a red herring - feel free to ignore it. I can, and do, get carried away in a quest for completeness sometimes.

However, the above assumes 400 lumens per watt from the LED at 350mA. We're nowhere even close. Cree claim 132 lumens per watt at 350mA (http://www.cree.com/press/press_detail.asp?i=1241094842732) at a junction temperature of 25C. Our junction temperature is going to be around 100C, which we'll call 80% efficiency, so we are now down to about 106 lumens/watt. At the higher current we are less efficient still: Cree say 345 lumens at 1.5A and a junction temperature of 25C (in practice it is worse than this, as the junction is going to be very hot), which at 5.25W comes to about 65 lumens per watt.
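As a sketch, those efficacy figures work out like this. The 132 lm/W, the roughly 80% hot-junction factor and the 345 lumen figure are the Cree numbers quoted above; 5.25W is the LED power from the earlier post.

# Real-world luminous efficacy, from Cree's published figures quoted above.
efficacy_350ma_25c = 132     # lm/W at 350 mA and 25 C junction
hot_junction_factor = 0.80   # roughly 80% of cold output at ~100 C junction
lumens_at_1500ma = 345       # lumens at 1.5 A and 25 C junction
led_power_w = 5.25           # 3.5 V x 1.5 A, as above

efficacy_hot_350ma = efficacy_350ma_25c * hot_junction_factor   # ~106 lm/W
efficacy_1500ma = lumens_at_1500ma / led_power_w                # ~66 lm/W

print(f"350 mA, hot junction:  {efficacy_hot_350ma:.0f} lm/W")
print(f"1.5 A, cool junction:  {efficacy_1500ma:.0f} lm/W")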

But we'll take 65 lumens/watt. With a maximum of 400 lumens/watt the LED is 65/400 = roughly 16% efficient, so about 84% of the energy going into the LED is coming out again as heat. Of the 5.25W going to the LED (probably nearer 7W from the battery), at best, about 4.4W is heat. Add the 0.93W from the driver and we are now losing about 5.3W, or 5.3 joules per second. That is a rise of about 6 Centigrade per minute. That is going to burn you pretty quickly, and it's why AAA lights tend not to drive their LEDs hard - they get far too hot, far too fast.
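Putting the whole heat budget together in one sketch: 65 lm/W against the 400 lm/W maximum, plus the 0.93W driver loss, all heating the same 55g aluminium body.

# Total heat into the light and the resulting rate of temperature rise.
efficacy_lm_per_w = 65           # what we actually get at 1.5 A with a hot junction
max_efficacy_lm_per_w = 400      # theoretical maximum used in this thread
led_power_w = 5.25               # power into the LED
driver_heat_w = 0.93             # heat from the driver (worked out earlier)
mass_kg = 0.055                  # 55 g aluminium body
specific_heat_j_per_kg_c = 910   # specific heat capacity of aluminium

led_heat_w = led_power_w * (1 - efficacy_lm_per_w / max_efficacy_lm_per_w)  # ~4.4 W
total_heat_w = led_heat_w + driver_heat_w                                   # ~5.3 W
rise_c_per_min = total_heat_w * 60 / (mass_kg * specific_heat_j_per_kg_c)   # ~6 C/min

print(f"LED heat:         {led_heat_w:.2f} W")
print(f"Total heat:       {total_heat_w:.2f} W")
print(f"Temperature rise: {rise_c_per_min:.1f} C per minute")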

If this isn't clear I'll attempt to explain it better.

According to Wikipedia a 5800K "ideal white source" has a maximum theoretical luminous efficacy of 251lm/W. Current technology white LEDs are not ideal white light sources - their spectrum is narrower, so less power goes into wavelengths the eye is insensitive to - which means they should be able to reach a somewhat higher efficacy than that. I've previously seen a figure of 283lm/W max for white LEDs, and I suspect the maximum efficacy for current-gen LEDs should be around 300lm/W. At maximum theoretical luminous efficacy, all energy that goes into the LED gets converted into light and no heat is generated.

Assuming that 300lm/W is the maximum figure for LEDs:

An XPG R5 @ 1W emits ~150lm. Compare that to an *ideal* LED giving 300lm @ 1W and you get 50% of the maximum theoretical efficacy. At 50% efficacy, 1W of electricity put into an LED generates 0.5W of light and 0.5W of heat.

At 5W that XPG R5 emits around 500lm, which gives a rough figure of around 100lm/W, or 33% of the maximum theoretical efficacy. If you put 5W of electrical energy into a single XPG R5 you get about 1.7W of light and 3.3W of heat.

If we were to have an array of five XPG R5s, each driven at 1W for a total of 5W, we would still have that 50% efficacy and should get 2.5W of light and 2.5W of heat out of that 5W of electricity.
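Here's the same light/heat split as a small sketch, assuming the 300lm/W maximum from the post above; the lumen figures are the rough ones quoted there.

# Split electrical input into radiated light and heat, given measured lumens
# and an assumed 300 lm/W theoretical maximum.
def light_and_heat(electrical_w, lumens, max_efficacy=300):
    light_w = lumens / max_efficacy   # watts of radiated light
    heat_w = electrical_w - light_w   # the rest comes out as heat
    return light_w, heat_w

print(light_and_heat(1, 150))        # one XPG R5 at 1 W:        ~0.5 W light, ~0.5 W heat
print(light_and_heat(5, 500))        # one XPG R5 at 5 W:        ~1.7 W light, ~3.3 W heat
print(light_and_heat(5, 5 * 150))    # five XPG R5s at 1 W each: ~2.5 W light, ~2.5 W heat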