Nitecore EA8 teardown - Modding will come later, in a different thread.

Thanks for the teardown, OL; makes me glad I didn’t buy an EA8.

Still digging the EA4W though.

Linh

I think what is needed and what is optimal are different things. Technically, yes, if the contact area is lower than that of the LED I would think it couldn’t be classed as a bottleneck. However, from a layperson’s viewpoint I would expect that the efficacy of heat transfer is a combination of the temperature differential and the contact area.

Normally the highest differential will be at the LED’s contact with the base plate, and that is also the smallest area. If, from that point outwards, you always have a larger contact area at each thermal connection, then even a lower differential can be wicked away. If you get to a surface area that is the same as, or only minimally more than, the LED’s contact area with the first heatsink, I think it gets more complicated.

As the heat differential at start-up will be greater between the LED and the first heatsink than between the heatsink and the body, heat will transfer into the first heatsink faster than it is carried away from it. If the surface area at both connections is exactly equal, I would expect the primary heatsink to continue heating up until the differential between the heatsink and the body is as great as that between the LED and the primary heatsink; then you would get an equilibrium where heat leaves the heatsink as fast as it comes in.

Of course, if you increase the contact area between the heatsink and the body, this equilibrium will be reached at a lower and lower heat differential, keeping the heatsink itself cooler, and I suppose by that logic allowing the LED to shed heat faster as well, since the differential across the first thermal junction becomes greater.

Just some thoughts, I can’t guarantee their accuracy but they seem to make sense in my head…
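
To put that equilibrium idea in rough numbers, here is a minimal lumped-resistance sketch in Python. The wattage, body temperature and junction resistance values are made-up illustrative figures, not EA8 measurements; it only shows that the heatsink settles where heat out equals heat in, and that a larger (lower-resistance) heatsink-to-body contact lowers that settling temperature.

```python
# Rough lumped model of the reasoning above: heat flows from the LED into the
# primary heatsink, then from the heatsink into the light body. All values
# below are illustrative guesses, not EA8 measurements.

P_LED = 6.0      # watts of heat dumped by the emitter (assumed)
T_BODY = 30.0    # body/ambient temperature in °C (assumed)

# At equilibrium, heat leaving the heatsink equals heat coming in, so the
# heatsink settles at T_sink = T_body + P * R_sink_body. A larger contact
# area means a lower R_sink_body, so the settling temperature drops, exactly
# as argued above.
for r_sink_body in (8.0, 4.0, 2.0, 1.0):   # °C/W across the heatsink-to-body contact
    t_sink = T_BODY + P_LED * r_sink_body
    print(f"R_sink_body = {r_sink_body:.0f} °C/W  ->  heatsink settles near {t_sink:.0f} °C")
```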

Either I misunderstand the laws of physics and the tests on my own light, or there is huge lumen sag on this light, or it’s engineered with barely enough heatsinking, or extra current is added to make up for it.

IMO it’s simply poor design. As we all know, the hotter the LED gets (or the slower heat is removed), the more the lumen output will sag. I think putting the light through some really thorough testing would answer many of our questions, but that would require an excellent IS, thermal analysis, precision power-supply equipment, a lot of time, and preferably someone who can also decode the electronics.

So now I wonder what the P25 looks like inside?

Output/Runtime graphs show turbo output to be perfectly flat for the duration of battery life (90 minutes with eneloops).

If that is the case, then there are questions to be answered as to how.

Is this a trick question? The light is able to adequately dissipate heat just like any other light. Heat goes from the LED through the heat sink to the body of the light where it is then transferred into the air… or hand… or water… or cheese… or whatever happens to be in contact with the light.

As the heatsink gets hotter the light output should fall, and since heat dissipation looks rather low, the heatsink may be getting close to three-digit temperatures, so how does the light keep the same brightness?

Because looks can be deceiving? The original post has been modified to say that the heatsink does indeed come into contact with the body of the light. Heat dissipation is adequate. Just because you think it “looks rather low” doesn’t make it so.

True enough, but seeing data from a heatsink that has far more thermal contact and still shows about 15% lumen sag within a few minutes makes me very skeptical about the performance of this setup.

BTW, cheese has been done; it didn’t fare very well, though having worked with it a fair bit I assume it has good emissivity.

Was that sag measured at around the same current? And was it definitely not due to voltage drop of the cells?

Maybe the star used by Nitecore has the dielectric layer removed from the centre spot under the LED connection… Just speculation, but that should improve heat transfer from the LED itself.

Freshly charged battery; I forget which one.
I would be interested in some testing with SinkPAD replacements vs regular stars with regard to lumen sag in flashlights. I suspect the results won’t be as good as Match’s tests, and heat shedding from the heatsink won’t be greatly increased (though it should be slightly), but speculation without actual data doesn’t make for verifiable facts.

I can’t comment on the setup you’re speaking of (having not seen it myself). However, at least two independent reviewers have posted runtime graphs of this light. Neither showed any lumen sag whatsoever. I’m guessing the engineers spent time optimizing the amount of material needed to match how hard the LED was being driven.

As for cheese, I find that too much of it can have a negative effect on my thermal pathways.

That’s why I think the explanation lies in increased current over time.

Voltage sag under load happens from any charge state; the key thing would be whether the circuit used can compensate (or whether a linear driver just burns off the excess above the LED Vf anyway). I think quite often an output drop due to voltage sag, and thus a drop in current to the LED, is misinterpreted as being due to heat.
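
As a toy illustration of that point (with completely made-up numbers, and a generic single-cell arrangement rather than whatever driver the EA8 actually uses), here is a sketch of why a sagging cell drags the current down through a linear stage, while an idealised switching regulator holds the set current until it runs out of headroom:

```python
# Toy comparison: LED current under cell voltage sag for a linear stage vs an
# idealised switching regulator. All numbers are arbitrary illustrations.

V_F = 3.1        # assumed LED forward voltage at the set current
R_SERIES = 0.25  # assumed total series resistance (cell IR, springs, FET), ohms
I_SET = 3.0      # regulated current target, amps

for v_cell in (4.2, 4.0, 3.8, 3.6, 3.4):
    # Linear/direct-drive-like: current is limited by the remaining headroom.
    i_linear = max(0.0, min(I_SET, (v_cell - V_F) / R_SERIES))
    # Idealised switcher: holds the target as long as any headroom remains.
    i_switched = I_SET if v_cell > V_F else 0.0
    print(f"cell at {v_cell:.1f} V: linear ~{i_linear:.2f} A, regulated ~{i_switched:.2f} A")
```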

Increased current over time makes no sense. It would have to be a pre-set current increase based on measured output sag due to heat. But heat also depends on external factors such as ambient temperature, so a pre-set programme like that wouldn’t be able to maintain a flat curve anyway.

Any XM-L LED will experience lumen sag; that’s why the datasheets are binned at a set temperature, and 85°C gives fewer lumens than 25°C for the same chip (the XM-L2 datasheet shows both), and there is also a graph of lumen reduction at given temperatures.
Also, for the 25°C values I believe Cree only ran the LED for a small fraction of a second to get an accurate number at that temperature, and then provided us those values.
How they are doing it is as yet unexplained, but I doubt it’s due to excellent heatsinking (or even adequate heatsinking); then again, speculation isn’t fact.
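
For a rough sense of scale, here is a back-of-the-envelope derating sketch; the ~0.2% per °C slope is only an eyeballed approximation of the sort of flux-vs-temperature curve the datasheet shows, not an official Cree figure.

```python
# Crude linear approximation of output droop with junction temperature.
# The slope is an assumption for illustration, not a datasheet value.

DERATE_PER_C = 0.002   # assumed fractional output loss per °C above 25 °C
T_REF = 25.0           # datasheet reference temperature, °C

def relative_output(t_junction_c: float) -> float:
    """Approximate flux relative to the 25 °C rating."""
    return max(0.0, 1.0 - DERATE_PER_C * (t_junction_c - T_REF))

for t_j in (25, 60, 85, 100, 120):
    print(f"Tj = {t_j:>3} °C  ->  ~{relative_output(t_j) * 100:.0f}% of the 25 °C lumens")
```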

I know, but I don’t see how that’s relevant, TBH.

What I mean is that the temperature of the light body will be different in different conditions, so the lumen sag will also be different. If Nitecore wanted to counteract that by increasing current, they would have had to use a pre-set programme, so it wouldn’t give flat regulation in all conditions. To assume selfbuilt’s runtime test happened to be conducted in exactly the conditions that balance it out for perfectly level output just seems too unlikely.

It also seems way OTT for Nitecore to bother doing. Most people wouldn’t notice even a lot of heat sag in real use, so why bother accounting for it just for our benefit?

That’s my point: unanswered questions.

Ha ha, sorry, but I think you’re definitely reading too much into it. That last question was a rhetorical one, meant to go unanswered. I think we would have to say that the heatsinking on this light is enough: not ideal, not great, probably not even good, but sufficient. That is by far the most reasonable conclusion, IMO.

Increasing the current will not offset the effect of thermal droop; in fact, it will make it worse.

The Selfbuilt results demonstrate that the small interference fit of the heatsink with the light body does provide an adequate thermal path (amazingly), hence the steady output until battery discharge. What is not known is how well this performs at higher ambient temps.