Using an IR thermometer on LEDs

Does pointing an IR thermometer at the front side of a powered LED cause big inaccuracies in readings?

I feel like the IR radiation emitted by a 5000 lumen LED might be messing with my IR meter, but maybe I’m wrong.

Can anyone confirm or deny?

I can’t answer your question, but I would think it’s inaccurate just because of the field of view of the IR sensor. They tend to have a cone-shaped measurement spot and kinda take an average of the whole area.
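
To put a rough number on that: most cheap IR guns have a distance-to-spot ratio of around 12:1, so the area they average over grows quickly with distance. A quick sketch (the 12:1 ratio is just an assumed example, check your meter’s spec):

```python
# Rough spot-size estimate for an IR thermometer.
# d_to_s is the distance-to-spot ratio from the meter's spec sheet;
# 12:1 here is only an assumed example, not a measured value.

def spot_diameter_mm(distance_mm: float, d_to_s: float = 12.0) -> float:
    """Diameter of the area the meter averages over at a given distance."""
    return distance_mm / d_to_s

for distance in (50, 100, 300, 1000):  # mm from the target
    print(f"{distance:>5} mm away -> ~{spot_diameter_mm(distance):.0f} mm spot")
```

So even a few hundred mm back, the spot is already a lot bigger than an LED die.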

https://assets.fluke.com/Video-TEMP/3810401_chipwade_ir_video_640x360.html?_ga=2.13406014.864955765.1535970536-2014240563.1535970536

Yeah, that’s why I usually hold the meter close up to the thing I’m measuring, so that the cone is as small as possible.

Link doesn’t work :confused:

I also gave my infrared thermometer a shot at measuring emitter temperature, and after a few tries with high-powered torches I finally had to accept something was wrong with it. Seems the phosphor layer covering the die actually runs at high temperatures. Have you heard about emitters turning blue just before death? That must be the phosphor layer melting down.

Here is a related article:

True or false: High-power LEDs don’t generate IR heat in the forward direction like a filament lamp @ EDN Network

:-)

My CFT90 was showing 100-120°C even though the copper was only at 40°C; this might just be the normal temperature difference between the die and the MCPCB.
I just expected the copper to be hotter if that were the case, since it’s heatsinked with liquid metal.

I got curious and did a search. Hope this helps - Noncontact and instant detection of phosphor temperature in phosphor-converted white LEDs | Scientific Reports
Figure 13 is interesting

What current are you running the emitter at?

Luminus gives the thermal resistance between the emitter and the back of the copper-core board as 0.45 °C/W. So you apparently measured a delta of 60-80 °C. If I’m understanding things right, that’d be what’s expected from ~150 W of heat at the junction (rough arithmetic sketched below).

Further, the datasheet says that at 22.5 A the forward voltage should be 3.5 V, which is ~78 W power consumption. Radiometric flux (including IR?) in those conditions is ~15 W, or about 20% of the power consumption (if I understand the DS correctly).

The junction is going to be the hottest spot. Are you running it at >35A? If not, the IR reading out the front seems, if anything, too high. How are you measuring the back side of the board?

I’m not sure I fully understand the implications of the EDN and Nature articles linked above, so I have my doubts about the reasoning above.
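
For what it’s worth, here’s that arithmetic as a quick Python sketch. The 0.45 °C/W, 3.5 V @ 22.5 A and ~15 W figures are the datasheet values quoted above; the 40 °C and 100-120 °C readings are from earlier in the thread, so treat it as a rough estimate only:

```python
# Back-of-envelope thermal math for a CFT-90-style emitter.
# The 0.45 °C/W and 3.5 V @ 22.5 A figures are datasheet values as quoted
# above; the 40 °C / 100-120 °C readings are the numbers from this thread.

R_TH = 0.45          # °C/W, junction to back of the copper-core board
T_BOARD = 40.0       # °C, measured on the copper
T_FRONT_READINGS = (100.0, 120.0)  # °C, IR-gun readings off the front

# 1) What heat load would explain the measured delta?
for t_front in T_FRONT_READINGS:
    implied_heat = (t_front - T_BOARD) / R_TH
    print(f"{t_front:.0f} °C front reading -> ~{implied_heat:.0f} W of heat at the junction")

# 2) What delta would the datasheet test point give?
v_f, i_f = 3.5, 22.5          # V, A (datasheet test condition)
radiant_flux = 15.0           # W leaving as light, so not heating the junction (rough)
heat = v_f * i_f - radiant_flux
print(f"At {i_f} A: ~{heat:.0f} W of heat -> ~{heat * R_TH:.0f} °C above the board")
```

At the datasheet test current the expected rise over the board is only about 30 °C, so a measured 60-80 °C delta points at a much bigger heat load, i.e. a much higher drive current.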

It’s running at 48 amps, so it could definitely be real temperatures.

Lol, at those currents I guess the article I linked is of little to no relevance.

I’m guessing this is for the syniosbeam project?

For reference, I could measure above 140°C in front of the aspheric lens of an SK98 modified with a degapped pill and a razor-blade-dedomed XHP50A driven at 3+ A (3.23 A sense resistor setting). In full flood.

Cheers ^:)

You’re measuring the phosphor, not the die, when pointing an IR thermometer at it. The phosphor can be much hotter than the die (see the Nichia E21A, for example). I think you’d need a blue LED (no phosphor) to measure the die temp that way.

Thanks :stuck_out_tongue:

Yup.
Minimal lux improvement at 48A so I backed it off to 40.
Very conservative, I know.

Ah ok, thanks. Is 100°C a relatively safe temp for the phosphor?

Should be no problem, but cooler is better.

Thanks, yeah I’ll check the temps at 40A now instead of 48.

You can use the change in Vf and brightness during a run to estimate approximately how hot the LED gets. I wouldn’t worry too much about the phosphor.
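
If anyone wants to try the Vf method: at constant current, LED forward voltage drops roughly linearly as the die heats up, typically somewhere around -2 mV/°C. Log Vf right at turn-on and again once it has warmed up, and you can back out an approximate die temperature. Rough sketch, with the -2 mV/°C coefficient being a generic assumption you’d ideally characterise for your own emitter:

```python
# Rough die-temperature estimate from the forward-voltage sag during a run.
# k_v (V per °C) is a generic assumed coefficient; characterise it for your
# emitter if you want real numbers (e.g. measure Vf at two known temperatures).

def estimate_die_temp(vf_cold: float, vf_hot: float,
                      t_cold: float = 25.0, k_v: float = -0.002) -> float:
    """Return the approximate die temperature (°C) from the Vf drop.

    vf_cold: forward voltage right at turn-on, die still near ambient
    vf_hot:  forward voltage after warm-up (same drive current!)
    t_cold:  die temperature at turn-on, assumed equal to ambient
    k_v:     Vf temperature coefficient in V/°C (often around -2 mV/°C)
    """
    return t_cold + (vf_hot - vf_cold) / k_v

if __name__ == "__main__":
    # Hypothetical example: Vf sags from 3.60 V to 3.45 V at constant current.
    print(f"Estimated die temp: ~{estimate_die_temp(3.60, 3.45):.0f} °C")
```

Both Vf readings have to be taken at the same drive current; the brightness sag over the same interval gives a second, independent cross-check, as mentioned above.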

You could also calculate the maximum die temp according to the datasheet and compare your die temp to it. You will probably be under it because of your cooling setup.
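
As a sanity check in that direction, you can also flip the same thermal-resistance math around: take the datasheet’s maximum junction temperature and work out the hottest the board is allowed to get at a given heat load. Sketch with placeholder numbers (the 150 °C Tj limit is only an assumed example; use the real limit and thermal resistance from the Luminus datasheet):

```python
# Maximum allowable board temperature for a given heat load, working back
# from an assumed junction-temperature limit. All numbers are placeholders;
# take the real Tj max and thermal resistance from the emitter's datasheet.

T_J_MAX = 150.0   # °C, assumed example junction limit
R_TH = 0.45       # °C/W, junction to back of board (as discussed above)

for heat_w in (60, 100, 150):  # W of heat dissipated at the junction
    t_board_max = T_J_MAX - heat_w * R_TH
    print(f"{heat_w:>3} W of heat -> keep the board under ~{t_board_max:.0f} °C")
```

If the measured board temperature stays comfortably below that line at your drive current, the die should be within spec.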