Astrolux S43S NW Test Results

Yes, that’s correct.

I had to add some music to cover a background conversation. Sorry. There’s no dialog, so you can turn the sound off if you want.

Video on how to change thermal step down.

I've been puzzled by these references in some reviews to the sudden, fierce thermal-regulation step down from turbo on the S43 and S43S. My S43S, unaltered as it came out of the box (ordered early November, arrived about a week ago), has always stepped down in a sequence of steps like those you cite after lowering the thermal regulation temperature. The first step down from turbo varies quite a bit depending on the state of charge of the battery and which battery; I've seen a 43 sec minimum and a 110 sec maximum. After that, the subsequent steps down always take place at regular intervals of around 23 seconds. It's the same behaviour whether I use a 30Q 18650 or an 18350, except that a fully charged 18350 can't reach the same initial turbo brightness or hold it as long as the 30Q.

I’m sure Tom E could explain the NarsilM thermal stepdown behavior in detail.

Keep in mind that the thermal sensor for the driver is built into a chip on the driver itself, the Atmel ATtiny MCU, so there is a certain delay and inaccuracy associated with it. Still, it works pretty well on lights that heat up a bit slower. On these small lights that heat up super fast it can sometimes cause the software to act funny.

ToyKeeper has tweaked the Anduril UI, which is going to be used on the forthcoming FW3A, to be more of a thermal control as opposed to a thermal stepdown. The FW3A also gets hot very quickly, but once it cools it can actually step back up in brightness. It still uses the ATtiny chip, which is what most of the BLF user interfaces are based on.

I have some notes on NarsilM v1.0, and they say the MCU samples the temperature every 45 seconds after the initial stepdown, in order to give the temperature a bit of time to stabilize. It seems newer versions have cut this time in half, probably to every 22.5 seconds, which is why we see it adjust in 23-second increments.
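For illustration only - not actual NarsilM source, and all the names and tick rates here are made up - a fixed-interval check like that could look something like:

#define TEMP_CHECK_TICKS 1406    // ~22.5 s at a 16 ms timer tick
#define TEMP_THRESHOLD   55      // step down above this reading (deg C)

extern int  ReadTempC(void);         // hypothetical sensor read
extern void StepDownOneLevel(void);  // hypothetical ramp-step helper

static unsigned int tempTicks = 0;

void Tick16ms(void)                  // called from the timer interrupt
{
    if (++tempTicks >= TEMP_CHECK_TICKS)
    {
        tempTicks = 0;
        if (ReadTempC() > TEMP_THRESHOLD)
            StepDownOneLevel();      // still hot: drop one more ramp step
    }
}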

Hhhmm, guess I should know, but feels like forever ago - I should revisit the code, refresh my ol memory chips - they've been losing bits every day...

That’s my experience, too. My only complaints are that

1. It steps down too far. As mentioned, you end up with less than 100 lumens by the time it's done.

2. It never steps back up after it cools off.

Okay, I get that (2) is probably not going to be implemented in a budget light, so I’m okay with that. But (1) should really not happen. Perhaps the firmware should have a “pause” limit of something like 300 lumens, and stay there for a few minutes before it tries further step-downs.
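To make (1) concrete, here's a rough sketch of what such a pause floor could look like. Everything in it is hypothetical, and since the firmware can't measure lumens (as noted further down), the floor would have to be a ramp level the builder picks per host to land near ~300 lm:

#define THERM_FLOOR_LEVEL  40   // lowest level plain stepdown may reach (~300 lm here)
#define THERM_PAUSE_CHECKS 8    // checks to hold at the floor (~3 min at 22.5 s each)

static unsigned char outputLevel;      // current ramp level (hypothetical)
static unsigned char pauseChecks = 0;

void ThermalStepDown(void)             // called when the sensor reads too hot
{
    if (outputLevel > THERM_FLOOR_LEVEL)
    {
        outputLevel--;                 // normal stepdown, one ramp step
        pauseChecks = 0;
    }
    else if (++pauseChecks >= THERM_PAUSE_CHECKS)
        outputLevel--;                 // still too hot after the pause: keep dropping
}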

OTOH, last night I was out walking in –16C temperatures, and used turbo for several minutes without any step-downs at all. So, the temperature sensor works okay. Unfortunately, turbo produces way too much heat for any decent run-times at room temperature. But outside in winter, it’s much more useful.

If it’s doing one giant stepdown, try what I mentioned earlier. Reset the stepdown temperature a little lower. After doing this I got several smaller stepdowns instead of one big one…

I believe the head heats up quicker than the thermal sensor does. So by the time the thermal sensor gets to 55°C (IIRC), the head is already well beyond this. Resetting the stepdown temperature based on how hot the head feels seems to work better than the factory setting.

Problem is, the MCU firmware can't detect lumen output - all it can control is relative output. It doesn't even know what amps are going out, let alone lumens, and the firmware, at least how I wrote it, is for generic lights, not specific LED, host, and battery configurations.

Also, the temp in the MCU is disconnected from the temp at the LED(s) and the temp you are feeling on the body. It's a very imperfect system. External temp sensors in the proper places would help for sure, but even then the algorithms aren't trivial.

However Zebralight does their thermal control, it works extremely well. I'm guessing they use some integrated approach (and maybe an expensive one) that isn't practical in this light.

As a possible partial-solution, could the firmware you wrote not be tweaked for whatever host is using it? For example, in this light, it could start at a lower temperature, and each time it steps down it could raise the temperature threshold a bit. That might slow down the step-downs, and allow the light to settle down at a higher output.
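Something like this, say - purely a hypothetical sketch of the idea, not anything that exists in NarsilM:

extern void StepDownOneLevel(void);   // hypothetical helper

#define THERM_START_C 48   // first stepdown trips early, before the head heat-soaks
#define THERM_RAISE_C 2    // raise the trip point a little after each stepdown
#define THERM_MAX_C   58   // hard ceiling no matter how many stepdowns

static signed char thermLimitC = THERM_START_C;

void ThermalCheck(signed char tempC)
{
    if (tempC > thermLimitC)
    {
        StepDownOneLevel();
        if (thermLimitC + THERM_RAISE_C <= THERM_MAX_C)
            thermLimitC += THERM_RAISE_C;   // tolerate a bit more heat next time
    }
}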

Zebralight designs their firmware for their individual flashlight models - we/I don't. Plus they have regulated output, so temps are predictable and can be modeled - ours are not; we use FET designs across extreme ranges of power/heat and all sorts of small/large hosts. I don't know how you can compare - it's totally different. I would love to have just one platform with regulated output and decent temp sensors to write firmware for - it would be so much easier.

The ATtiny85 built-in temp sensor is just not reliable. Here's a table of values I've used for recent offset calibrations in NarsilM for the ATtiny85:

// Temperature Calibration Offset - value is added to reading, higher the #, higher the reading:
#define TEMP_CAL_OFFSET (4)
// 4 about right for the C20C#2 (triple)
// -11 about right for the C20C, kludgy driver (think MtnE modded)
// -18 adjusted for the X7R DEL 17mm driver, piggybacked
// -12 rough guess for the C8F 21700 triple
// -12 rough guess for the X6R triple
// -14 about for the TA driver for the M8
// -3 try for SP03
// 1 about right for the C8F #1
// -12 guess for the JM70 #2
// -19 is about right for the Lumintop SD Mini, IOS proto driver
// -3 Decided to use this for Q8 production
// -6 BLF Q8 Round 3 - blinks 29C w/3 setting for 20C (68F) room temp
// -2 try for the Manker U21 (LJ)
// -2 works for the Warsun X60 (robo) using the 17 mm DEL driver
// -1 try this for proto #1, OSHPark BLF Q8 driver
// 3 about right for BLF Q8 proto #2 and #3, reads ~20 for ~68F (20C)
// -12 this is about right on the DEL DDm-L4 board in the UranusFire C818 light
// -11 On the TA22 board in SupFire M2-Z, it's about 11-12C too high, reads 35C at a 23C (73.4F) room temp
// -8 For the Manker U11 - at -11, reads 18C at 71F room temp (22C)
// -2 For the Lumintop SD26 - at -2, reading a solid 19C-20C (66.2F-68F for 67F room temp)

So the range from above is from -19 to 4, a 23C swing just in the offset - who knows how badly calibrated it is in gain, and then how sensitive it is to change, etc. It's really an awful sensor to use for anything much. Dr Jones spent weeks/months developing a PID-based algorithm for thermal regulation, and I still hear there are problems with it.
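For reference, the offset just gets added to the raw reading. On the ATtiny85 the internal sensor is read as ADC channel MUX=1111 against the internal 1.1 V reference, at roughly 1 count per degree C with about 275 counts at 0C - but those are only the datasheet's typical figures, hence the table above. A rough sketch:

#include <avr/io.h>

#define TEMP_CAL_OFFSET (4)              // per-light fudge, as in the table above

int ReadTempC(void)
{
    ADMUX  = _BV(REFS1) | 0x0F;          // 1.1 V internal ref, MUX=1111 = temp sensor
    ADCSRA |= _BV(ADEN) | _BV(ADSC);     // enable ADC, start a conversion
    while (ADCSRA & _BV(ADSC)) ;         // wait for it to complete
    return (int)ADC - 275 + TEMP_CAL_OFFSET;  // ~275 counts at 0C is a typical figure only
}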

So it's a hot rod like the D4 that pulls like 20A on turbo - no big surprise here, jasonww :slight_smile: I kinda expected it to get hot very fast on the highest mode.

Very interesting!

Now I understand much better what is going on. And of course the software has to be generic, to still do sensible things with different kinds of heat sinking, different kinds of battery technology, etc. The ideal solution is PID control. Before I retired I was a research roboticist, and I'm familiar with how well PID systems work in ideal situations, and with the ways they misbehave when things are less ideal than the textbooks presume. They need to be carefully parameterised for the specific thing they're controlling, and to be conservative enough in design that product variability within a specific model of flashlight (such as how much heat-sink compound the assembler applied, and how well) doesn't cause the control to misbehave. PID systems can misbehave very badly when circumstances step outside the design assumptions.
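For anyone who hasn't met one: the control law itself is only a few lines, and all the difficulty is in choosing the gains and handling saturation. A minimal discrete PID sketch in C, with illustrative gains and clamps rather than anything tuned for a real light:

typedef struct {
    float kp, ki, kd;                    // gains: the hard part to choose
    float integral, prevErr;
} Pid;

int PidStep(Pid *p, float setpointC, float measuredC, float dt)
{
    float err = setpointC - measuredC;   // positive while below the temp limit
    p->integral += err * dt;
    float deriv = (err - p->prevErr) / dt;
    p->prevErr = err;

    float out = p->kp * err + p->ki * p->integral + p->kd * deriv;
    if (out < 1.0f)   { out = 1.0f;   p->integral -= err * dt; }   // crude anti-windup:
    if (out > 150.0f) { out = 150.0f; p->integral -= err * dt; }   // undo growth when clamped
    return (int)out;                     // ramp level to drive the FET at
}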

A really big problem in these little "hot rod" lights, which can heat up to dangerous levels in less than a minute, is that there's enough thermal lag between the heat-producing parts, the heat-sensitive parts, and the sensor itself that by the time the sensor detects dangerous heat levels, even switching the light off completely might not be enough: heat still on its way from the hottest parts to the heat-sensitive parts could keep temperatures rising for several seconds after power-off, possibly enough to cause damage. And even if the parameters were carefully tweaked for a particular model, such as the S43S, and worked very well, a new, even higher-drain battery could come out a year later and cause annoying misbehaviour at the least, and terminal damage at worst.

And of course the variation between the technologies of different manufacturers will be much more extreme than within-product manufacturing variations. The problems generic software must face in thermally regulating these little monsters are very extreme indeed!

I'd be tempted to try to build some self-calibration into the software, some way of estimating how much thermal lag and thermal inertia there was in the system so that it could adjust itself on the run to different models, new kinds of battery, and of course different user selectable power levels. For example, how long does it take after the power has been applied for the sensor to detect a 10 degree rise in temperature, a 20 degree rise, and a 30 degree rise? Some useful information which could lead to better parameterisation of the control system might be able to be derived from that.
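As a sketch of that probing idea (every name in it is hypothetical): engage turbo, then log how long the sensor takes to report each successive 10 degree rise over the starting temperature:

extern int  ReadTempC(void);        // hypothetical sensor read
extern void SetTurbo(void);         // hypothetical
extern void DelaySeconds(int s);    // hypothetical

#define RISE_STEPS 3

void MeasureThermalLag(unsigned int riseSecs[RISE_STEPS])
{
    int t0 = ReadTempC();                     // baseline before turbo
    unsigned int secs = 0;
    unsigned char step = 0;

    SetTurbo();
    while (step < RISE_STEPS && secs < 300)   // give up after 5 minutes
    {
        DelaySeconds(1);
        secs++;
        if (ReadTempC() - t0 >= 10 * (step + 1))
            riseSecs[step++] = secs;          // seconds to +10, +20, +30 degrees
    }
    // Fast rise times imply a hot rod with little thermal mass, so the
    // controller could pick lower thresholds and earlier stepdowns.
}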

It would be very time consuming and very tricky to try to develop that kind of adaptive software by trial and error in real flashlight systems in the lab. I'd be inclined to try to develop it first in simulated systems which were designed to have more extreme variability than ever could be met with in practice, and then try it out in practice. It would not be surprising if that process had to be iterated a few times to shake out the bugs in the simulation and in the assumptions behind the simulation design.

In terms of software development difficulty I'd say this was around the level of an interesting PhD project!

Ahh, ok. Yes, I haven't done PIDs myself, but I have worked extensively with lead/lag/gain control from 1996 to the present day - development/support of a multi-joint rehab system. Though I'm weak in the theory, I have lots of practical experience - basically it's a motor feedback control loop, and PIDs and L/L/G's are just two different ways of accomplishing it, but PID is preferred nowadays.

Dang, you were in robotics research? I'm working on a similar (rehab) new project with the robotics team at NASA, plus we have another couple projects leveraging actuators -- it's getting big into some of the more challenging patient treatment options, though costly of course, so it's still more popular in research w/universities or specialized clinics.

Agree - a simulated system would be great. I think Dr Jones did it this way; plus, he developed it on a computer, so it was easy to spin changes, get instant feedback, etc. Only once you've got it working well in a test/sim setup do you port it down. After that, of course, you find out how good and real-world your test/sim setup really was.

It would be nice if Atmel came out with a new MCU that could still run existing software, but had a way to use an external temp probe. That would solve so many problems. Get rid of the lag, much better temperature control.

You could add an external temp sensor on an '85, it's just that it will take an I/O pin. Currently on the FET+1's using the internal voltage divider, there's a spare pin even with a switch LED. Of course I'm say'n this like I've done it, but actually I'm not familiar with what temp sensors are out there and exactly how they work. I thought, though, that there are more reliable ones that don't need calibration. Thought Richard had it working, Led4power for sure - dunno how well they work though.
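Just to show it really is only a pin and a few lines: assuming an analog sensor along the lines of an LM35 (10 mV per degree C) hung on PB4/ADC2 and read against the 1.1 V reference - an assumption on my part, not a tested circuit - a read could look like:

#include <avr/io.h>

int ReadExternalTempC(void)
{
    ADMUX  = _BV(REFS1) | 0x02;              // 1.1 V internal ref, MUX=0010 = ADC2/PB4
    ADCSRA |= _BV(ADEN) | _BV(ADSC);         // enable ADC, start a conversion
    while (ADCSRA & _BV(ADSC)) ;
    return (int)(((long)ADC * 110) / 1024);  // counts -> mV/10 = deg C for a 10 mV/C part
}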

led4power charges €0.20 (23 cents) for theirs, so it's pretty cheap.

I finally tried to measure the intensity of an S43S, which I've had for some months, and got only 4500 cd, measured at 7.6 m. The light, however, seems very strong; it hurts my eyes.
The meter was a cheapy HS1010, the one that comes in two pieces with a cable between them. I tried a second time with a Dr.Meter lux meter, but I got about the same, i.e. 4600 cd.
I think that is too low. Any suggestions?

How are you doing the conversion from lux to candela? It should be:

candela = lux * 7.6 * 7.6

Also, according to the ANSI/NEMA FL-1 standard, you should use only the max reading, the highest # you encounter - no averaging, no typical, just the max. It should be taken after 30 secs, but I know I cheat a bit on that rule - I don't time it, but it takes me a while of hunting to find the highest reading.
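The whole calculation fits in a few lines. Using the numbers in this thread (about 78 lux at 7.6 m), and the FL-1 definition of throw as the distance where illuminance falls to 0.25 lux:

#include <math.h>
#include <stdio.h>

int main(void)
{
    double lux = 78.0, dist_m = 7.6;     // the meter reading and distance discussed here
    double cd = lux * dist_m * dist_m;   // candela = lux * d^2    -> ~4505 cd
    double throw_m = sqrt(cd / 0.25);    // FL-1 throw (0.25 lux)  -> ~134 m
    printf("%.0f cd, %.0f m throw\n", cd, throw_m);
    return 0;
}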

I go to this site, Lux to candela (cd) conversion calculator, put in the data and voilà. Also, I press the max button.

K, did a simple test on that site and the #'s look good. Dunno - I forgot what I got for throw, don't have the #'s offhand. Not sure what others measured offhand either. Could you be off by a factor of 10? I usually use my meter in 10's, or 100's for my big (mostly custom) throwers, but that's at 5 meters.

So you measured about 78 lux at 7.6 meters and got 4500cd or 134 meters of throw?
What led and battery are you using?

Banggood says the XP-G3 version does 184 meters. Nichia will be less, of course.

I never measured the throw on my nichia version. I consider it a short range light.

I checked my numbers. I have an S43S with 219C's. After my tweaks I measured 8.75 kcd on a 30Q cell. I noted I got a 7.1% bump in lumens from stock with the tweaks, so the bump in kcd should be about the same. The tweaks were sanding/grinding down the bezel and better springs - not sure what else.