How close is temp sensor to emitter temperature?

I never see this talked about.

Has anyone tested how close the reported temperature is to the actual emitter temperature?

For example in Emisar or Noctigon lights.

I wonder how much of a discrepancy there is between the two.

The temp sensors are built into the driver, so the reading depends on the driver temperature, not the actual temperature of the MCPCB. If the sensor measured the actual LED, it would begin stepping down very quickly, long before the host became heat saturated. There are some drivers that have an external temperature sensor, but they are custom deals.
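For illustration, here’s a minimal C sketch of what a driver-internal temperature read amounts to. `adc_read_internal_temp()` and the calibration constant are hypothetical stand-ins, not Anduril’s actual code:

```c
#include <stdio.h>
#include <stdint.h>

/* Sketch of a driver-internal temperature read.
 * adc_read_internal_temp() is a hypothetical stand-in for whatever the
 * MCU's ADC temperature channel returns; the calibration constant is
 * made up. */

static uint16_t adc_read_internal_temp(void)
{
    return 300;  /* stubbed raw 10-bit reading, for illustration only */
}

/* The value reflects the MCU die, which only sees heat after it has
 * soaked through the MCPCB, pill/shelf, and driver PCB. */
static int16_t driver_temp_c(void)
{
    uint16_t raw = adc_read_internal_temp();
    const int16_t OFFSET_COUNTS = 275;   /* placeholder per-unit calibration */
    return (int16_t)raw - OFFSET_COUNTS; /* assumed ~1 count per degree C */
}

int main(void)
{
    printf("driver temperature: %d C\n", driver_temp_c());  /* 25 C with the stub */
    return 0;
}
```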

In real-world applications I’m sure it makes no difference at all. Pretty sure most, if not all, torches with thermal programming can be overridden anyway?

With current Anduril thermal management.

Which leads me to wonder just how large a discrepancy there is between the emitter temp and the temp-sensor temp.

Led4power used to have drivers and MCPCBs set up with sensors on the MCPCB itself.

The thing is, if we could sense the MCPCB temp, we’d set the thermal limit higher. It can take a lot more heat. Ultimately the goal would still be “light is warm but doesn’t burn my hand” or something.

“In modern LEDs, it is standard that junction temperatures reach up to 100°C and beyond.” (LED Heat Dissipation: An Optimization Guide | SimScale)

Well put. :student:

There is a beneficial temperature discrepancy between the LED and the operator’s hand. :+1:

So it appears the datasheets aren’t referencing actual emitter temp when listing non-junction temp specs.

That makes sense, as they often list junction temp max at 150 °C.

LEDs (and power components like diodes and FETs) can also take a lot of heat before failing, as mentioned, often into the 210+ °F range. They have to be robust to survive reflowing.

If a driver could be developed with an external thermocouple to measure the MCPCB, it would indeed allow a higher thermal ceiling. I’ve tested lights that run up to 105 °C and survive (Wuben A1 and Fenix LR80R), but that’s obviously not good for them, since the LEDs are actually much hotter (one SST-70 did start failing on the Fenix).
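If someone did put an NTC thermistor on the MCPCB, the conversion is straightforward. A sketch using the beta-parameter equation, with assumed component values (10k NTC, B = 3950, 10k divider to Vcc, 10-bit ADC):

```c
#include <math.h>
#include <stdio.h>

/* Convert an ADC reading from a 10k NTC on the MCPCB (low side of a
 * divider with a 10k resistor to Vcc) into degrees C using the
 * beta-parameter equation. All component values are assumptions. */

#define ADC_MAX      1023.0   /* 10-bit ADC */
#define R_FIXED      10000.0  /* divider resistor, ohms */
#define R_NOMINAL    10000.0  /* NTC resistance at 25 C */
#define T_NOMINAL_K  298.15   /* 25 C in kelvin */
#define BETA         3950.0   /* assumed beta coefficient */

double mcpcb_temp_c(double adc_counts)
{
    /* NTC on the low side: V = Vcc * R_ntc / (R_ntc + R_fixed) */
    double r_ntc = R_FIXED * adc_counts / (ADC_MAX - adc_counts);
    /* Beta equation: 1/T = 1/T0 + (1/B) * ln(R/R0) */
    double inv_t = 1.0 / T_NOMINAL_K + log(r_ntc / R_NOMINAL) / BETA;
    return 1.0 / inv_t - 273.15;
}

int main(void)
{
    printf("ADC 512 -> %.1f C\n", mcpcb_temp_c(512.0));  /* ~25 C */
    printf("ADC 300 -> %.1f C\n", mcpcb_temp_c(300.0));  /* ~46 C, NTC resistance drops when hot */
    return 0;
}
```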

Electronics and LEDs can easily operate at 80 °C, but your skin and lithium batteries top out around 45 °C. Anyway, high temperature is still a source of electronic component degradation.

Our skin doesn’t like much more than 50 °C. I have a high pain tolerance, so 55 °C is doable. Electronics and LEDs don’t care too much, though. Electronic device designs incorporate active cooling or big passive heatsinks for reliability when components are subjected to high temperatures. Some li-ion batteries have an 80 °C thermal limit, but only for very short periods.

Most drivers with thermal regulation (vs. timed step-down) have the sensor in the µC itself, so there’s an assload of lag between the LED cooking itself to death and the driver actually registering that heat.

There are some MCPCBs that have a thermistor right there, to at least measure that temperature, but they’re few and far between.

So step-down is really to keep the light from becoming too hot to hold, not to keep the LED from cooking itself. In some lights, thicker wires, a beefier FET, and a low-resistance cell can make the LED go blue from cooking itself (seconds), while the driver only gradually heats up (minutes) before tripping thermal step-down.
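To get a feel for the seconds-vs-minutes gap, here’s a rough two-node lumped thermal model (an emitter/MCPCB node dumping heat into a driver/host node). Every constant is invented purely for illustration:

```c
#include <stdio.h>

/* Rough two-node thermal model: the LED/MCPCB node receives the waste
 * heat directly; the driver node only warms up through a thermal
 * resistance to that node. All constants are invented. */

int main(void)
{
    double t_led    = 25.0;  /* C, LED/MCPCB node */
    double t_driver = 25.0;  /* C, driver/host node (where the sensor lives) */

    const double P_WASTE   = 15.0;   /* W of heat dumped at the LED (turbo-ish) */
    const double C_LED     = 20.0;   /* J/C, small thermal mass near the emitter */
    const double C_DRIVER  = 400.0;  /* J/C, host + driver mass */
    const double R_LED_DRV = 2.0;    /* C/W, LED node -> driver node */
    const double R_DRV_AIR = 4.0;    /* C/W, driver node -> 25 C ambient */
    const double DT        = 0.1;    /* s, simulation step */

    for (int step = 0; step <= 1200; step++) {     /* simulate 120 s */
        double q_to_driver = (t_led - t_driver) / R_LED_DRV;
        double q_to_air    = (t_driver - 25.0) / R_DRV_AIR;

        t_led    += DT * (P_WASTE - q_to_driver) / C_LED;
        t_driver += DT * (q_to_driver - q_to_air) / C_DRIVER;

        if (step % 200 == 0)   /* print every 20 s */
            printf("t=%3ds  LED=%.1f C  driver=%.1f C\n",
                   step / 10, t_led, t_driver);
    }
    return 0;
}
```

With these made-up numbers the LED node races tens of degrees ahead within seconds, while the driver node takes minutes to creep toward any sensible trip point.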

I’ve always wondered how much this benefits Zebralight since the MCU temp sensor is directly beside the LED on the same PCB. Their thermal regulation is the best in the industry by far. You can pick up a light and instantly see the output start adjusting just from the heatsink effect of your hand. I assume it’s a combination of good coding and optimizing the location of the temp sensor.

Faster response time, less thermal and temporal lag, etc., actually make it easier to program for.

Add delays, etc., and it starts to oscillate. That’s why a lot of lights have these wild swings in brightness. They won’t throttle down until practically cooking, then dim to almost nothing ’til they cool sufficiently, then bright, dim, bright, dim, etc.

As I pointed out before, imagine a heater with a thermostat set to heat up the room. So it goes full blast, cuts down, full blast, cuts down, etc., and keeps the room at a relatively constant temperature. That’s the system used for “ATR”.

Problem is, the LED is the heater, and the comfy room is the driver with sensor. So the LED pretty much goes on/off/on/off/… and the light you hold on to stays a nice comfy maximum temp.
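Here’s what that looks like as code: a thermostat-style (“bang-bang”) step-down driven by the lagged sensor reading. All constants are made up:

```c
#include <stdint.h>
#include <stdio.h>

/* Bang-bang step-down sketch: only two output levels, and the decision
 * uses the lagged driver temperature. Because the sensor runs minutes
 * behind the LED, the light overshoots hot, drops hard, overcools,
 * ramps back up, and repeats. Constants are made up. */

#define TEMP_LIMIT_C   45   /* step down above this sensed temperature */
#define HYSTERESIS_C    5   /* step back up once this far below the limit */
#define LEVEL_HIGH    255
#define LEVEL_LOW      64

static uint8_t regulate(int16_t driver_temp_c, uint8_t current_level)
{
    if (driver_temp_c >= TEMP_LIMIT_C)
        return LEVEL_LOW;                      /* cut output hard */
    if (driver_temp_c <= TEMP_LIMIT_C - HYSTERESIS_C)
        return LEVEL_HIGH;                     /* ramp back to full */
    return current_level;                      /* dead band: hold */
}

int main(void)
{
    uint8_t level = LEVEL_HIGH;
    const int16_t sensed[] = { 30, 40, 46, 44, 42, 39, 38, 45 };  /* lagged readings */

    for (unsigned i = 0; i < sizeof sensed / sizeof sensed[0]; i++) {
        level = regulate(sensed[i], level);
        printf("sensed %2d C -> level %3d\n", sensed[i], level);
    }
    return 0;
}
```

A proportional controller that trims output gradually as the sensed temperature approaches the limit avoids the hard bright/dim swings, but only if the sensor isn’t too far behind reality.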

I wonder how close we get to the junction max after a couple of shots of turbo, once the light is difficult to hold. With, for example, a hot-running SST-20 D4V2.
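For a ballpark, here’s a back-of-the-envelope estimate. Every number (reported temp, waste heat, thermal resistances) is an assumption, not a measurement of an actual D4V2:

```c
#include <stdio.h>

/* Back-of-the-envelope junction temperature estimate for a hot turbo run.
 * Every number below is an assumption, not a measurement. */
int main(void)
{
    double t_sensor       = 55.0;  /* C, what the driver reports ("too hot to hold") */
    double delta_driver   = 10.0;  /* C, guess: MCPCB runs hotter than the driver */
    double p_thermal      = 8.0;   /* W of waste heat in one emitter on turbo */
    double r_jsp          = 2.5;   /* C/W, junction-to-solder-point, datasheet-ish guess */
    double r_solder_mcpcb = 1.0;   /* C/W, solder joint + DTP copper board, guess */

    double t_junction = t_sensor + delta_driver
                      + p_thermal * (r_jsp + r_solder_mcpcb);

    printf("Estimated junction: %.0f C (rated max is typically 150 C)\n",
           t_junction);   /* ~93 C with these guesses */
    return 0;
}
```

With these guesses it lands well under the 150 °C rating, but bump the waste-heat or thermal-resistance assumptions and the margin shrinks fast.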