H17F - programmable driver with full thermal regulation

A bit more than I expected, but this is a premium driver. Much cheaper than a Prometheus Icarus driver and likely better anyway. Gives me a reason to order a few more things from mtn electronics.

Cool.

Are there any estimates on how low the lowest few modes are?

I’m going to guess it will depend on the emitter quite a bit. I’m using it in an XP-L triple.

I posted a picture above of the lowest mode compared to 1 lumen in the S30 Baton.

It visually looks like about half the light compared to the lowest mode of my Wizard Pro (warm), which is 0.2 lumens, according to the manual.

Hope that helps!

The lowest few modes are very low. A very nice moonlight, for sure.

I am surprised you are seeing such a small difference in DD output. In my triple XP-L and Nichia 219C lights the difference in output amps is much bigger than what PrinceValorum is seeing. That said, a big jump in amps on the high-end does not equate to a big increase in visual output, so in most cases it isn't a huge deal.

The temperature sensor is very smooth, but it will let the small triples overshoot the set point by a significant margin; in cases with large temperature swings the driver is too smooth and does not ramp down fast enough. It still will not badly overheat, but it overshoots more than I'd like to see happen. An algorithm adjustment that allows for bigger steps when the temperature increases past the set point by a significant margin or at a significant rate would certainly help alleviate the problem; i.e., smoother at normal temperature increases, but bigger steps with rapid swings.
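Roughly what I have in mind, as a sketch (Python, with made-up step sizes and thresholds for illustration; this is not the H17F's actual algorithm):

```python
# Hypothetical asymmetric thermal step-down: small, smooth steps for slow
# drift past the setpoint, bigger steps when the overshoot or its rate of
# change is large. All numeric thresholds here are invented.

def thermal_step(temp_c, prev_temp_c, setpoint_c, level, dt_s=1.0):
    """Return an adjusted output level (0-255) given the current temperature."""
    error = temp_c - setpoint_c           # degrees past the setpoint
    rate = (temp_c - prev_temp_c) / dt_s  # degrees per second

    if error <= 0:
        return level                      # at or below setpoint: no change

    if error > 10 or rate > 1.0:          # big overshoot or rapid swing
        step = 8                          # ramp down aggressively
    elif error > 5:
        step = 3
    else:
        step = 1                          # gentle step for normal drift

    return max(0, level - step)
```

The point is just that the step size grows with the overshoot (and with how fast the temperature is climbing), so a small triple that blows past the setpoint gets pulled back quickly while normal slow warming still sees the smooth behavior.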

Thanks for the additional info, everyone. I looked up some MOSFET resistances and it looks like the one used in this driver has 25 to 30 mOhms on-resistance. I think the dedicated FET drivers have less than 10 mOhms? An additional 20 mOhms is like having an additional tailcap in the circuit. Not too bad, but enough to cause a significant drop in current, as people seem to be experiencing. I think I will use a FET driver for my triple/quad just because I want it to be a little lumen monster. Looks like I have a good excuse to build a nice C8 with the H17F :sunglasses: .
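To put the extra ~20 mOhms in perspective, here's a back-of-the-envelope direct-drive estimate (all numbers assumed for illustration; it treats the emitter Vf as fixed, which is a simplification):

```python
# Crude direct-drive model: I = (V_batt - V_f) / R_total.
# V_batt, V_f, and the baseline 100 mOhm circuit resistance are assumptions.

def dd_current(v_batt, v_f, r_circuit_mohm):
    return (v_batt - v_f) / (r_circuit_mohm / 1000.0)

# Assume a 4.0 V cell under load, 3.3 V Vf triple, and ~100 mOhm of cell IR,
# springs, wires, etc.
i_fet = dd_current(4.0, 3.3, 100)        # big FET, ~0 mOhm added: 7.0 A
i_h17f = dd_current(4.0, 3.3, 100 + 20)  # + ~20 mOhm FET: ~5.8 A
```

So with these made-up but plausible numbers, 20 mOhms costs you roughly an amp in direct drive, which lines up with what people are reporting.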

The double-tap feature is one thing I really like about the FW. How quick do the taps need to be to access this mode? Is it easy to accidentally access this mode when switching modes quickly?

On a slightly related note: I was looking at Dr. Jones' different FWs and saw that the guppydrv universal clicky I have on my QLITE 8*7135 has a battery voltage indicator. This feature is not described on Mountain Electronics' page. It seems to work like lucidrv: 8 quick clicks bring you to the programming menu, where it blinks once for each 0.1V above 3.0V.
Richard, you might want to add this to the description for this FW on your page; I think it is a pretty attractive feature and might be an important point for some people.

The MOSFET on the H17F is between 25-29 mOhms when operated between 4.5V and 2.5V gate voltages. We're never going to see 4.5V, or even 4V under load (remember the drop from the diode + battery sag under load), and that value is when the FET is cool. I suspect that in real-world use we'll typically see 29+ mOhms from this FET. Still good enough to provide a big boost in output, but not nearly as good as the big MOSFETs. The MOSFET I am using on the new MTN-17DDm is rated at 2.13 mOhms at 4.5V gate drive, and looking at the graph in the data sheet, an estimated 3.75 mOhms at 3V. In other words, an order of magnitude less resistance, but it is also much bigger, more expensive, and is harder to drive (more gate capacitance).
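To make the resistance gap concrete, here's a quick conduction-loss comparison (the 29 and 3.75 mOhm figures are from above; the 8 A current is an assumed example):

```python
# P = I^2 * R and V_drop = I * R for each FET at an assumed 8 A.

def conduction_loss(current_a, r_mohm):
    r = r_mohm / 1000.0
    return current_a ** 2 * r, current_a * r  # (watts, volts dropped)

p_h17f, v_h17f = conduction_loss(8.0, 29)   # ~1.86 W, ~0.23 V lost in the FET
p_big, v_big = conduction_loss(8.0, 3.75)   # ~0.24 W, ~0.03 V
```

The big FET wastes almost nothing, while the small one both heats the driver and eats a couple tenths of a volt of headroom, which is where the output difference comes from.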

So, it's a tradeoff. Like I said before, the difference between an 8A and 10A triple may sound staggering, but visually the difference isn't as big as it would seem, so really it depends on what you value most.

I think the average tailcap has a bit more resistance than 20 mOhms. I don't think you would be disappointed if you use the H17F. It is very responsive to user input and is an advanced but practical driver, even for use in smaller lights. On these lights the lumen output drops drastically at high temperature, so by throttling back the current you are actually saving lumens, and you will be able to sustain those lumens longer: more heat, fewer lumens, more battery used. I think this is the draw of a driver like this. I know it's not that simple, but I think you will be pleased with the performance of this driver.

anything I say can’t and won’t be used against me when compared to proven fact…

I am looking forward to using this driver in my S2+ with XP-L HI, and possibly a C8. But I’m planning a triple or quad XP-L with single 26650, and I think I want to optimize this one for maximum power. Maybe there won’t be a huge practical visual difference between using this driver and the 17DDm, but what can I say, sometimes I’m not practical. :smiley:

I measured my S2+ tailcap by sending current through it and measuring the voltage drop, and it was 50mOhm including stock spring, and 20mOhm bypassing the spring altogether.
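For anyone wanting to repeat this measurement, it's just Ohm's law: push a known current through the part, read the voltage drop across it, and divide. The numbers below are illustrative examples, not my exact meter readings:

```python
# R = V / I, returned in milliohms.

def resistance_mohm(v_drop_v, current_a):
    return v_drop_v / current_a * 1000.0

r_stock = resistance_mohm(0.050, 1.0)   # 50 mV at 1 A -> 50 mOhm (stock spring)
r_bypass = resistance_mohm(0.020, 1.0)  # 20 mV at 1 A -> 20 mOhm (spring bypassed)
```

Measuring the drop directly across the tailcap (rather than relying on the meter's resistance range) keeps your lead and probe resistance out of the result.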

I just installed this driver in my S2+ with XP-L HI V2 3D. I love the extensive customizability.

Temperature setpoint is 60C. In high mode (2.9A using all the 7135 chips), the current starts to drop within 20 seconds and is down to 1.5A by around the 1 minute mark.

In turbo mode, I’m getting around 4.5A with a pretty fresh LG D1 cell at around 4.26V open circuit. In turbo mode, the thermal regulation is much less active. It doesn’t really significantly drop the current in the first minute and the flashlight gets much hotter than high mode, of course. Is the regulation behavior different in turbo mode because it is using the FET channel?

It just occurred to me the different behavior could be a result of the heat generated by the linear regulators. In high mode, the regulators are dropping ~0.4V which means they are dissipating ~1.2W at 3A. The FET, with 30mOhms resistance, is dissipating 0.6W at 4.5A. Maybe the driver is actually getting hotter in high mode than in turbo mode because of this local heat production?

EasyB, in your case the 4.35V cell may not be the best choice for efficiency at the lower levels. It is obviously heating the driver quite a bit.

Most higher performance cells will have similar behavior at 3A. For example the LG D1 discharge curve nearly coincides with the Samsung 30Q and Sony VTC5 at 3A for the first half of the discharge.

It is a compromise between linear driver heating and being able to stay in 3A regulation for as long as possible. Ideally, there would be a cell that had a completely flat discharge curve at 3.6V at 3A, to be most efficient. I chose the LG D1 for this flashlight because it provides the longest time in 3A regulation; the discharge curve levels out near the end more than most other cells, so you get more useful power out of it.

Just checked my S3 and M2 both at level 22. Temp at level 5. LG HG2.

S3: 2.86 amps at start, climbing to 2.95 in 10 seconds. At about 50 seconds it starts to drop.

M2 3.03 amps at start. Stayed there for over 2 minutes before I gave up.

I think this could be a host issue.

S3 is the shelf one and I feel the heat almost instantly.

M2 is also a shelf design; just barely warm after 2 minutes.

Thanks for your data. What about in turbo mode? If linear regulator heat is contributing to the regulation, battery state of charge would also be important.

I just did a more careful timing of my light regulation. From room temperature, 2.93A. After 30 seconds starts to drop. By one minute it was at 2A.

Topping off my LGs. Will repeat level 22 and level 24 in both hosts.

Repeated level 22 with fresh laptop pulls. Set temp to 60C.

S3: 2.9 A at start. At 2 minutes the drop begins; 2.5 minutes, 2.5 amps; 3 minutes, 1.9 amps. Light is very hot.

M2: 3.03 A for 4 minutes without dropping. My meter auto-powered off. Light is warm.

Turbo mode, LG HG2. Temp at 60C.

S3: 3.8 amps. At 2.5 minutes the drop starts; 3.5 minutes, 2.6 amps and dropping fast. Really hot.

M2: 5.4 amps. At 2.5 minutes the drop starts; 3.5 minutes, 3.8 amps and dropping slowly. Hot.

I don’t know why I am getting different amps on turbo. Both have spring bypasses. Connections are clean. Could it be the XP-L HI in the M2 vs. the dedomed XM-L2 in the S3???

Hmm, looks like the S3 on turbo lasts 30 seconds longer than on level 22 before reaching temp.

The lights were tested standing on their heads, with my hands partially cupped around them while I held the leads to the tail. With better cooling, level 22 might not be an issue on the S3 host. I'm going to guess that 2-2.5 minutes is when the heat rush reaches the driver on both hosts…

Thanks again for the data. So it seems to me like the heat produced by the linear regulators is significantly contributing to the temperature of the driver, causing it to step down quicker in high mode than in FET only mode.

My S2+ seems to step down quicker than your S3. Part of this could be host related, but I imagine there is also some variability in the temperature sensors.

How do you like the M2? There seems to be a good amount more thermal mass based on your measurements. How is the beam compared to the S3?

A problem suddenly appeared today. The light will randomly flicker and not reach full brightness in all modes using all 8 of the 7135 chips (modes 17-22). Using the FET channel, there is no problem. Also, modes using 1 7135 chip (first 16 brightness modes) are fine. It is not related to the thermal regulation because I’ve changed the temperature setpoint all around and it makes no difference.

Could one of the 7135 chip connections be bad and cause this behavior?

I had bought another H17F driver to use in another build, so I stuck this extra one in my S2+. I will see what mtnelectronics says about the malfunctioning driver.

Just to expand on this idea some more: with the LG D1 cell at about 3.9V OC, the current was just under regulation at about 2.8A in mode 22. The current was very close to the regulated value of 2.93A, but the regulators did not have to burn off any extra heat, so the driver temperature stayed lower and thermal regulation did not happen in the first 2 minutes. Then when the cell was charged to 4.15V OC, thermal regulation again happened in about 30 seconds in mode 22.

At close to full charge, a typical high performance cell will provide 3A at around 4V. The 7135 chips then must dissipate 0.45V*3A=1.35W, but this is only 0.17W per chip. This is within specs for the 7135; the data sheet suggests that at dissipation powers above 0.7W, additional heat sinking is required to regulate the junction temperature below 120C.
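Writing that per-chip calculation out (the 0.45 V headroom, 3 A, and 8 chips are the numbers above; the 4.0 V loaded cell voltage and implied ~3.55 V Vf are assumptions consistent with them):

```python
# Eight 7135s share the regulated current, so each dissipates its share
# of the total headroom power.

v_cell_loaded = 4.0   # assumed loaded cell voltage near full charge
v_f = 3.55            # assumed emitter Vf at 3 A, giving the 0.45 V headroom
i_total = 3.0
n_chips = 8

p_total = (v_cell_loaded - v_f) * i_total  # ~1.35 W across the whole bank
p_per_chip = p_total / n_chips             # ~0.17 W each, well under 0.7 W
```

Even at full charge, each chip is sitting at roughly a quarter of the 0.7 W point where the datasheet says extra heat sinking becomes necessary.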

So the question is: because the temperature sensor is on the driver and not closer to the emitter, is the thermal regulation stepping down the current too much too soon? Maybe so, but it should only be a problem when the cell is close to full charge. Within several minutes at 3A most cells’ voltage will drop much closer to the Vf of the emitter and the regulators won’t have to dissipate as much heat. Then the driver temperature should be much closer to the pill temperature.

I can’t really think of a better way to do it other than have a separate temperature sensor much closer, thermally, to the emitter. What about an IR thermometer on the top of the driver aimed at the bottom of the pill? :slight_smile:

I did some testing of the low voltage protection and voltage indicator function. The low voltage protection behaves as the manual says, but the cutoff voltage for mine was 2.8V instead of the 3.0V stated in the manual. At 2.8V the current dropped until it reached 1.4mA, then stayed at that value for as long as I kept monitoring.

Voltage indicator:
10 blinks: 3.86V - 4.00V
9 blinks: 3.70V - 3.85V
8 blinks: 3.60V - 3.70V
7 blinks: 3.44V - 3.60V
6 blinks: 3.30V - 3.43V
5 blinks: 3.19V - 3.30V
4 blinks: 3.08V - 3.19V
3 blinks: 3.00V - 3.08V

It has the capability to blink more than 10 times and fewer than 3; these are just the voltage ranges I measured closely. The pattern is pretty close to each blink corresponding to 0.1V above 2.8V.
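For reference, here's a small lookup that reproduces the measured table above (it only covers the 3-10 blink bands I measured; behavior at exact band edges like 3.60V is ambiguous in my data, and what the driver does outside this range is untested):

```python
import bisect

# Lower threshold of each measured band: 3 blinks from 3.00 V,
# 4 from 3.08 V, ..., 10 from 3.86 V.
thresholds = [3.00, 3.08, 3.19, 3.30, 3.44, 3.60, 3.70, 3.86]

def blinks(voltage):
    """Blink count for a voltage within the measured 3.0-4.0 V range."""
    return 2 + bisect.bisect_right(thresholds, voltage)

# blinks(3.05) -> 3, blinks(3.75) -> 9, blinks(3.95) -> 10
```

Since the thresholds come straight from my measurements rather than a formula, this should match the table exactly within the measured range.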