I don’t buy it. I used my Zebralight SC600w MkIV HI plenty this past winter in the cold, usually around –10C. I ran the Sanyo GA, Samsung 30Q, and Sony VTC6 in it and didn’t notice any difference in run time. Granted, I didn’t test it, but I’d certainly notice a difference between 30% and 80%!
I suspect it’s only an issue if the battery is kept constantly cold, even during operation. Since the battery warms up while I’m using the light, the usable energy ends up about the same across brands. It’s interesting information, but I’m not sure how useful it is in real-world use.
Yes, they do. By the time the battery is 30% or 40% drained, I need to recharge it because the light’s output has dropped so much. FET drivers are great on a full battery, but they suck as the battery discharges.
With a boost driver, I can use 100% of the capacity, because it regulates output regardless of cell voltage.
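A rough sketch of what I mean, with made-up numbers (the 10W output level and 90% efficiency are just assumptions): the driver holds the LED power fixed and simply pulls more current from the cell as it sags.

```python
# Sketch: constant-output boost driver, hypothetical numbers.
# The driver holds LED power fixed and compensates for a sagging
# cell by drawing more current from the battery.

LED_POWER_W = 10.0   # assumed regulated output level
EFFICIENCY = 0.90    # assumed converter efficiency

for v_batt in (4.2, 3.8, 3.4, 3.0):
    i_batt = LED_POWER_W / (EFFICIENCY * v_batt)
    print(f"cell at {v_batt:.1f}v -> battery current {i_batt:.2f}A, "
          f"LED still gets {LED_POWER_W:.0f}W")
```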
Then it’s not a very good boost driver. I’m not sure where you’re going with that. My Zebralights can deliver max output until the battery voltage drops to 2.9v, and at that point the battery is almost completely drained.
It will work just fine at full output. Yes, there’s an additional 0.1v of voltage sag, but the boost driver compensates, and the extra capacity of the GA cell makes up for it. With a boost driver it’s watt-hours of energy that matter, not amp-hours or voltage under load.
Again, it doesn’t matter. The boost driver will work at any cell voltage above 2.9v. At a 5A load, the GA cell delivers 11Wh of energy (per HKJ’s written review), versus 10Wh for the 30Q.
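Quick back-of-envelope from those Wh figures. With a boost driver at constant output, runtime scales directly with Wh, so the ratio needs no extra assumptions; the absolute minutes below assume ~3.4v average under a 5A load, which is just my guess.

```python
# Runtime from the 5A energy numbers quoted above (HKJ's review).
ENERGY_WH = {"Sanyo GA": 11.0, "Samsung 30Q": 10.0}
AVG_V, LOAD_A = 3.4, 5.0            # assumed load conditions

for cell, wh in ENERGY_WH.items():
    minutes = wh / (AVG_V * LOAD_A) * 60
    print(f"{cell}: {wh:.0f} Wh -> roughly {minutes:.0f} min")

ratio = ENERGY_WH["Sanyo GA"] / ENERGY_WH["Samsung 30Q"]
print(f"GA runs about {(ratio - 1):.0%} longer at the same output")
```

So around 10% more runtime for the GA at that drain, whatever average voltage you assume.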
At 6-8 amps the difference may be smaller, so I suspect there’s not much of an advantage to the GA cell if I were to run the light on turbo constantly. But I don’t do that (the light would overheat unless it’s cold outside), so the extra energy in the GA cell does make a difference.
:nerd_face: That’s being a little pedantic. I think you know what I meant when I said it’s a single emitter. The only one I can think of that might require more than 10 amps from an 18650 is the XHP70.2 (which I’ll call a single emitter), and even then I think you’d have to be over-driving it, or close to it. But yeah, if I had a 1x18650 light with an XHP70.2 emitter, I’d probably use a high-drain cell.
It’s in his written reviews. Unfortunately, he only measures the total energy at drain levels up to 5A. But that’s probably a reasonable drain level for most 1000-1500 lumen lights.
The FET and the boost both prefer the high-drain battery. Both driver designs will deliver the same amount of light on the same battery; it’s just the way they deliver that light that changes.
The FET driver is great when the battery voltage is 4.2v. It sucks by the time the resting voltage drops to around 3.8v. The boost driver doesn’t care; it just runs at constant output.
With a FET driver you don’t get steps, per se. Once you set aside the thermal-regulation stuff, you get a steady downward-sloping curve as the voltage drops. The FET starts at a higher output; that’s its advantage. The boost driver starts at a lower max level but holds a steady output; that’s its advantage.
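If you strip it down to a toy model, the two curves look like this. The forward voltage, path resistance, and regulated level are all illustrative guesses, just to show the shapes:

```python
# Crude model of the two behaviours described above; all component
# values are made up for illustration.

V_F = 3.0        # assumed LED forward-voltage knee
R_PATH = 0.10    # assumed total path resistance (FET + springs + cell IR), ohms
BOOST_A = 2.5    # assumed regulated LED current for the boost driver
CUTOFF_V = 2.9   # boost low-voltage cutoff mentioned upthread

for v_batt in (4.2, 4.0, 3.8, 3.6, 3.4, 3.2):
    fet_a = max(v_batt - V_F, 0.0) / R_PATH         # slopes down with voltage
    boost_a = BOOST_A if v_batt >= CUTOFF_V else 0.0  # flat until cutoff
    print(f"{v_batt:.1f}v: FET ~{fet_a:4.1f}A (sloping down), "
          f"boost {boost_a:.1f}A (flat)")
```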
No, that’s not how a boost driver works. It can start, and stay, at any output the driver is designed for, regardless of the battery’s input voltage. It can even beat a FET driver on a fresh battery, if it’s designed to push more current to the LED.
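For the curious, here’s why the input voltage doesn’t matter much: an ideal boost converter gives V_out = V_in / (1 − D), so the driver just adjusts its duty cycle D as the cell sags. The 12v target below is an assumption (roughly what a 12v-class emitter would want); real drivers regulate current too, but the idea is the same.

```python
# Ideal boost relation: V_out = V_in / (1 - D)  =>  D = 1 - V_in / V_out.
# The 12v output is an assumed target, not a measured value.

V_OUT = 12.0  # assumed LED string voltage

for v_in in (4.2, 3.7, 3.2, 2.9):
    duty = 1 - v_in / V_OUT
    print(f"cell at {v_in:.1f}v -> duty cycle {duty:.0%} keeps {V_OUT:.0f}v out")
```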