I’ve had a low-severity light addiction for many years. I never get into the > $100 lights that some of you love, but I can’t resist cool $30-$40 lights.
That’s why I’m loving the S2+ (which I only recently discovered). I know it’s old news to most of you, but to me, this is a nearly perfect light. It’s compact, uses 18650s (which I have hundreds of), and it can be either a modest practical light or an “OMG how is that much light coming from that tiny flashlight” show stopper, with the SST-40 + programmable 12-group 8*7135 driver.
But here’s what I don’t get… why the heck is it using a linear driver and not a switching driver? I haven’t really paid attention to driver tech in probably a decade, but even 10 years ago, 7135s were already considered outdated. Yes, they are dirt simple to parallel on a board, but they’re so inefficient.
Looking at the voltage curve for an SST-40, at 3A the forward voltage is only about 3V. That means with a linear driver and a fresh 4.2V 18650, the driver drops the remaining 1.2V, so you’re turning nearly 30% of your power into driver heat. That’s bad for runtime both in terms of battery capacity and overall heat.
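To put rough numbers on it, here’s a quick sketch of the linear-driver math across a typical 18650 discharge. It assumes a simple model (everything above the LED’s forward voltage is burned in the driver) and Vf ≈ 3.0V at 3A, eyeballed from the SST-40 curve, so the exact figures are approximations:

```python
# Simple linear-driver loss model: the driver dissipates the full
# difference between cell voltage and LED forward voltage as heat.
# Vf ~= 3.0 V at 3 A is an approximation read off the SST-40 curve.

I_LED = 3.0    # drive current, amps
VF_LED = 3.0   # SST-40 forward voltage at 3 A (approx.)

for v_cell in (4.2, 4.0, 3.7, 3.4, 3.0):
    p_total = v_cell * I_LED   # power drawn from the cell
    p_led = VF_LED * I_LED     # power actually delivered to the LED
    p_heat = p_total - p_led   # burned in the driver (7135s)
    eff = p_led / p_total
    print(f"{v_cell:.1f} V cell: {p_heat:.1f} W of driver heat, {eff:.0%} efficient")
```

At 4.2V that’s 3.6W of heat for only 71% efficiency; the driver only approaches 100% near 3.0V, which is also where it drops out of regulation and goes essentially direct-drive.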
7135 chips are small, but 8 of them still take up board space, and you should be able to fit an SMD inductor, a switching controller, a MOSFET, a current-sense IC, and a diode in the same footprint.
So why is it so hard to find an S2+ with a switching buck driver?