Improving heat transfer of flashlights?

Agree. Except for extremely cold weather there is probably never a good reason to transfer heat to the battery/compartment. That would apply to flashlights or vehicles or any batteries. And testing bike lights indoors with no air flow is not realistic.

This is a long-known fact that unfortunately most light makers still ignore. :frowning:
If you want the lowest thermal resistance, unibody is the thing. If you want just the right thermal resistance to get good thermal transfer yet avoid burning your fingers - add thermal bottlenecks where they make sense. Having an ultra-short thermal path to the switch and a thermal block on the battery tube is a poor choice…

As to putting thermal compound on the threads - this has been discussed before as well, but I don’t think I’ve seen anyone actually doing it yet.
The reason is that these compounds are abrasive and will quickly eat your threads. So you should be careful to always unscrew from the tail, and not move the other thread at all except for occasional maintenance. Is it worth it? Maybe.

Yes, your test clearly shows that the unibody design moves more heat to the battery compartment.

Are all 3 lights set to the same lumen output?
Do all 3 lights have the same mass?

IMO, the reason some lights do not overheat is that they have a low lumens-to-weight ratio…

A 250 lumen Malkoff or HDS will not have the heat problems of a 2500 lumen Anduril light.

Obviously the best way to reduce heat is to reduce lumens and increase the weight of the host.

It’s not so simple. Weight gives you 2 things:

  • heat capacity (the thermal buffer)
  • (usually, depending on host design) beefier thermal paths, so lower thermal resistance

Heat capacity is a thermal buffer. The more of it you have, the slower the temperature rises at any given heat production. But this only helps make Turbo last longer. Over a long time, even a slow rise will end up at the same maximum temperature.
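To put a rough number on that buffering effect, here’s a sketch with made-up but plausible values (100 g aluminium host, 10 W of heat on Turbo - my assumptions, not anything measured in this thread):

```python
# How long can heat capacity alone buffer a Turbo run before the body
# warms from 25 °C to 60 °C, ignoring all cooling?  t = m * c * dT / P
mass_kg = 0.10        # assumed 100 g aluminium host
c_al = 897.0          # specific heat of aluminium, J/(kg*K)
heat_w = 10.0         # assumed heat dumped into the host on Turbo, W
dT = 60.0 - 25.0      # warm-up span, K

t_seconds = mass_kg * c_al * dT / heat_w
print(f"~{t_seconds:.0f} s of buffering")  # ~314 s, about 5 minutes
```

With real cooling the warm-up takes longer, but the point stands: mass only buys time, it doesn’t change where the temperature ends up.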

Lower thermal resistance, if present, has a couple of effects:

  • it may make the surface temperature more uniform; cooling is very roughly linear with host temperature - average temperature, not maximum. At a fixed power output, a host with a more uniform surface temperature will have a lower temperature at its hottest point. In other words - it will be cooler.
  • this is a marginal effect, but lower thermal resistance means that the LED runs cooler. And cooler LEDs are more efficient, so at the same output level they produce less heat (I’m not sure, but I think that at the same current level a cooler LED produces more light AND more heat).
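The second point can be sketched with a toy thermal-resistance chain (all resistance and power numbers below are assumptions for illustration, not measurements):

```python
# Toy model: LED -> internal path -> surface -> air.
# T_led = T_ambient + P * (R_internal + R_surface)
t_amb = 25.0      # ambient, °C
power_w = 5.0     # heat produced at the LED, W
r_surface = 8.0   # surface-to-air resistance, K/W (set by host size/airflow)

for r_internal in (2.0, 6.0):   # unibody-ish path vs bottlenecked path
    t_led = t_amb + power_w * (r_internal + r_surface)
    print(f"R_int = {r_internal} K/W -> LED at {t_led:.0f} °C")
# Same output, same host surface, but the lower internal resistance
# leaves the LED 20 °C cooler in this toy example.
```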

And, BTW, weight matters only if you compare 2 hosts made from the same material. There are huge differences not only between e.g. alu and copper but even between aluminium alloys.

More important is the heat-dissipating surface area, not the mass.

We used to cut fins into 2D Maglites to help improve the heat dissipation. Aussie-Yank Flashlights Inc. Or how to span the oceans with crazy mods and good BLF friends. All Done
We were losing mass but creating fins, which increases the surface area available to shed heat. Most plain aluminum heat sinks use the same approach. I have always read that copper absorbs heat better and aluminum sheds heat better. CPU heatsink designers have used both materials in several high-performance CPU heatsinks. Since they spend lots of money and R&D trying to win the coolest-running heatsink award, I assume they have done the research and know what works.

More surface contacting with cool air … simple :slight_smile:

You wouldn’t need to use thermal paste on the threads; any regular silicone grease or whatever would work just fine to improve the thermal transfer.

That’s not true. Copper has much better thermal conductivity and higher volumetric heat capacity, so if you compare the same part made in both materials, copper wins on both counts. If you make the aluminium part larger, so it has more volume at the same weight, it’s the opposite though. For CPU heatsinks, I guess that weight is a limitation, so they use copper close to the CPU, where it matters the most, and alu elsewhere.

ADDED:
Some old data that I had available easily:

Regular alu 6061 has a thermal conductivity of 150 W/(m·K) and a volumetric heat capacity of 2.42 J/(cm³·K). 6063 has 200 W/(m·K) and nearly the same 2.43 J/(cm³·K). Copper has 400 W/(m·K) with 3.44 J/(cm³·K). 7068, which I think is the overall best flashlight alu for being much tougher than 7075 with great thermals, has 190 W/(m·K) with 3 J/(cm³·K).
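One derived number worth looking at is thermal diffusivity (conductivity divided by volumetric heat capacity), which tells you how quickly a temperature gradient evens out. A quick sketch using the figures above (taken as handbook-ish values, not measured by me):

```python
# Thermal diffusivity = k / c_vol; higher = heat spreads out faster.
# (conductivity in W/(m*K), volumetric heat capacity in J/(m^3*K))
materials = {
    "alu 6061": (150.0, 2.42e6),
    "alu 6063": (200.0, 2.43e6),
    "alu 7068": (190.0, 3.00e6),
    "copper":   (400.0, 3.44e6),
}
for name, (k, c_vol) in materials.items():
    alpha = k / c_vol  # m^2/s
    print(f"{name}: {alpha * 1e6:.0f} mm^2/s")
# Copper comes out well ahead, so it flattens the LED-to-surface
# gradient faster than any of the aluminium alloys.
```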

ADDED2:
Well, the “copper wins on both counts” is not so simple. What are we counting for capacity? Comparing pure numbers, it’s what I wrote above. But what really matters in our context is the time before the surface temperature reaches 60 °C. Which is different.
I don’t understand the physics of heating up too well, but what I do understand is that there will be a temperature gradient: the closer to the LED, the higher the temperature. The gradient will be steeper with alu because of its lower conductivity. The LED will be hotter and the material close to the LED will be hotter as well. If we had 2 materials with the same volumetric heat capacity and different conductivity, the one with lower conductivity would overheat slower, because the hotter material near the LED would absorb more heat before the surface warmed up. But here alu has both lower conductivity and lower volumetric heat capacity. So which one would overheat faster? I don’t know.

I went through some of this stuff back when I was into overclocking…

When comparing copper and aluminum heatsinks you have to ask whether you are considering heatsinks of equal volume or equal mass. If equal volume, the copper heatsink wins handily but weighs over 3 times as much. If comparing heatsinks of the same mass, the aluminum heatsink will be over 3 times the volume of the copper one and can have a much greater surface area. In that situation the copper heatsink will get hotter and initially pull more heat out of the heat source, but once it heats up, the aluminum heatsink may be more effective at dumping heat to the environment and allow for a lower heat-source temperature at equilibrium.
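The “over 3 times the size” figure follows straight from the densities; a quick sketch (standard handbook densities, same shape assumed for both parts):

```python
# Same mass, different material: how much bigger is the aluminium part?
rho_cu = 8960.0   # density of copper, kg/m^3
rho_al = 2700.0   # density of aluminium, kg/m^3

vol_ratio = rho_cu / rho_al            # Al volume / Cu volume at equal mass
lin_ratio = vol_ratio ** (1.0 / 3.0)   # linear scale-up if shape is kept
area_ratio = lin_ratio ** 2            # surface-area scale-up

print(f"volume x{vol_ratio:.2f}, surface area x{area_ratio:.2f}")
# -> volume x3.32, surface area x2.22
```

So at equal mass the aluminium part gets roughly 3.3x the volume but only about 2.2x the surface area if you just scale the same shape; fins are how you turn the extra volume into extra area.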

All of this depends on specifics of geometry, thermal contact and radiator design, etc. Ultimately to maximize heat transfer you want the outside of your heatsink to get as hot as possible, have as much surface area as possible and have it be in contact with as cold an environment as possible (generally flowing air or coolant).

For a flashlight, though, we may not care about equilibrium temperature, but just keeping the light head cool for a minute or so. Under these circumstances a heatsink with higher thermal conductivity and more thermal mass (i.e. a big old chunk of copper) probably wins (but may get uncomfortable to hold pretty quickly).

It is important to make sure the heat coming into the heatsink also exits it. This can happen via conduction to your hand, convection and radiation.

Emissivity is a measure of how efficiently a surface radiates thermal energy compared to an ideal black body. Higher emissivity is better, as it means heat can leave the heatsink for the atmosphere faster.

My recollection is that dark-colored anodized aluminum has much better emissivity than polished copper or bare aluminum, and slightly better than tarnished copper.

I wanted to show how heat is transferred from head to body and see the differences. The bike lights are just for comparison with the unibody design. If I wanted to simulate the outdoor conditions of a bike light I could cool them with a fan, but here I wanted to see how heat spreads over the body of the light. It also shows that review tests of lights that would normally get cooling airflow are similarly unrealistic.

Fireflight, your recollection is about right. I’m curious what the breakdown of radiative vs convective cooling is in our typical applications. Anodized aluminum, regardless of color, is pretty darn good when it comes to emissivity.

The best way to study heat removal is to look at computer CPUs. You can go convection, convection with a fan, or some combination of liquid usage.

The same applies to gas engines: you have 2-cycle air-cooled and 4-cycle water-cooled designs. The VW Bug was an air-cooled engine.

Flashlights are no different except much smaller form factor.

The thermodynamic principles are basic laws of physics.

While everything varies with geometry, I saw some old data on heatsinks for electronics showing that in most cases radiation accounted for under 10% of cooling. When forced air was used, that went down to well under 10%. So I’m not sure how much emissivity really matters here.
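As a rough sanity check for a bare flashlight body (no fins, still air), here’s a back-of-the-envelope split using the Stefan-Boltzmann law. All the inputs are assumptions for illustration: ~100 cm² of surface, emissivity 0.9 for dark anodising, and a typical natural-convection coefficient of 10 W/(m²·K).

```python
# Rough radiative-vs-convective split for a bare flashlight body.
SIGMA = 5.67e-8                  # Stefan-Boltzmann constant, W/(m^2*K^4)
area = 0.01                      # ~100 cm^2 of exposed surface (assumed)
emissivity = 0.9                 # dark anodised aluminium (assumed)
h = 10.0                         # natural convection, W/(m^2*K) (assumed)
t_surf, t_amb = 333.15, 298.15   # 60 °C body, 25 °C room, in kelvin

p_rad = emissivity * SIGMA * area * (t_surf**4 - t_amb**4)
p_conv = h * area * (t_surf - t_amb)
frac = p_rad / (p_rad + p_conv)
print(f"radiation {p_rad:.1f} W, convection {p_conv:.1f} W "
      f"({frac:.0%} radiative)")
```

With these textbook numbers the radiative share of a bare, still-air body comes out well above 10%; finned heatsinks radiate less per unit area because the fins shadow each other, and forced air inflates the convective term, which may explain the low figures in that old heatsink data.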

Interesting.

If that’s the case, then the main way heat will exit a light is convection. So you need lots of surface area and airflow over it.

I recall doing some quick and rough calculations on that years ago… with the conclusion that for small flashlights (and small for me is DQG Tiny 18650 and smaller), hand cooling > wind cooling > radiation. And I say wind cooling rather than convection because even small hand movements generate more wind than natural convection does. So improving surface emissivity might help some, but it’s not really a big deal.

I’ve found hand cooling only works up to a point.

My modded DQG 18650 Tiny IV with Anduril very quickly becomes a hot potato if I try to run it on turbo too long.

:heart_eyes:

Yesterday I tried putting cheap silicone thermal paste on the threads, and it works. The temperature difference between tube and head is lower, maybe by 5 °C or more. I need to test it further; thermal convection also seems to be better. As for emissivity, black anodised aluminium will be best overall. Raw aluminium, in the case of my WT3M, is worse compared to a black body. Now I have two WT3M in raw, but the second one has an SST40, so if I compare it with the other one, which has an XHP50.2, I don’t know if it would be a fair comparison for a cooling test at all. Also, yesterday I repaired my 50.2 version because there was a leak in one of the 7135 chips and I removed it; I don’t have spare chips. By the way, I found the defective chip very easily with a thermal camera. But there is also a problem with the charging circuit, and it doesn’t work anymore, though I don’t need it.
Also, I am waiting to receive my 3rd WT3M, with XHP50.2 and black anodisation. It will be interesting to compare it to the other 50.2, with its raw body and thermal paste.