It’s crazy to me that modern Li-ion batteries, at least at the 14500 size, don’t have much more capacity than NiMH batteries.
I’ve experimented with recharging alkalines, and it’s definitely not worth it. They will leak badly within a few days after charging. You can top them up with about a 20% charge, as long as you do it while they’re half-full or higher. And you can do it a few times, but they’ll leak before you do it more than 4 or 5 times.
You have to keep the charge rate low, 200mA is okay. Otherwise, they won’t take the charge. Stop at 1.6v. For safety, do it somewhere that can take a mess in case they rupture during charging. I’ve never had that happen, though.
It’s fun to experiment. But it’s definitely not worth it economically, or for safety reasons.
Well, exceptions abound, but comparing good quality NiMH cells against similar quality Li-Ion cells, it seems that NiMH has twice the capacity in mAh, but Li-Ion has three times the voltage, so Li-Ion comes out just slightly ahead.
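A quick back-of-envelope in Python makes that comparison concrete (the capacities and nominal voltages here are assumed typical figures, not measurements):

```python
# Rough stored-energy comparison, using assumed typical nominal figures.
def energy_wh(capacity_mah, nominal_v):
    """Approximate stored energy in watt-hours."""
    return capacity_mah / 1000 * nominal_v

nimh_aa = energy_wh(2000, 1.2)     # eneloop-class AA NiMH (assumed)
liion_14500 = energy_wh(800, 3.7)  # typical 14500 Li-ion (assumed)

print(f"AA NiMH:      {nimh_aa:.2f} Wh")
print(f"14500 Li-ion: {liion_14500:.2f} Wh")
```

With those figures the 14500 comes out around 20% ahead in watt-hours, which matches the “just slightly ahead” impression.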
There aren’t any really good 14500 cells. 18650 (and maybe 21700) is where the sweet spot is now.
I wish a major company would develop the 26650 format. It seems PLB and others are now. We can get 5500 and close to 6000 mAh with a low discharge rate, and it can do 20 A+, like the new Shockli battery. But they don’t have access to the patents and technology the big giants have. I guess they could try to reverse engineer it, like running a mass spec on the electrolytes to see the exact ratios, and on the anode and cathode materials to see their makeup.
Isn’t it just. For example, you can buy a D cell in NiMH with 10,000 mAh capacity, which is maybe equivalent to a decent 18650 Li-ion, but far larger, heavier, and more expensive. Though you could strap 4× 2500 mAh AAs together into the same diameter, shorter and for less money.
In the AA/14500 size though, which covers a lot of useful devices, they are comparable. It seems nobody is interested in developing 14500; I think it is a lost cause. But AA NiMH will continue to be the best, though I doubt much more improvement will come, it is a very mature chemistry. (Watch out for “fat” AAs though, some won’t fit devices built to spec.)
Summary: in AA/14500, NiMH has the advantage except at the highest currents, and AA is an affordable, safe and readily available cell, in alkaline or even lithium primary versions too. A torch that can only work with the voltage from a 14500 is a bit of a dead-end (though I have two).
I have a lot of use for AA torches.
Coupled with the versatility and performance characteristics of the simple AA NiMH, AA torches IMO are the most underrated of flashlight platforms.
What they can do plus all the advantages of their compactness still kinda astonishes me to this day. I mean we’re talking about something with only 1.2V behind it pushing over 200 lumens in some cases for a good duration.
Improvements in the NiMH chemistry are just icing on the cake and not really required when we already have those glorious eneloops.
Maybe part of the reason that NiMH (along with alkaline, NiZn, etc.) is not seeing much development is that they don’t run at a very convenient voltage.
LEDs require, what, 3 volts or so? If you use Li-Ion, it’s not too hard to make drivers with around 90% efficiency. And lots of other electronics runs around 2-5 V.
But sub-2-volt chemistries require boost drivers, which are not as efficient, and if you want to avoid them, you need to use multiple cells in series. Both of these options can be less appealing than using a single Li-Ion cell.
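For what it’s worth, here is a rough sketch of why the low cell voltage hurts: for the same LED power, a boost driver has to pull far more current from a 1.2 V cell than from a 3.7 V one (the 90% efficiency and the LED figures are assumptions for illustration):

```python
# Cell-side current a boost driver must draw: P_in = P_out / efficiency,
# then I_in = P_in / V_in.  All figures here are illustrative assumptions.
def input_current(v_led, i_led, v_cell, eff):
    """Current drawn from the cell to deliver i_led at v_led."""
    return (v_led * i_led) / (eff * v_cell)

# A 3 V LED at 1 A, driver assumed 90% efficient:
print(f"From 1.2 V NiMH:   {input_current(3.0, 1.0, 1.2, 0.90):.2f} A")
print(f"From 3.7 V Li-ion: {input_current(3.0, 1.0, 3.7, 0.90):.2f} A")
```

Roughly three times the cell current for the same output, which means proportionally more resistive loss in springs, contacts and the cell itself.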
Please tell me if I'm wildly off track or something here. :)
I think the problem just comes from size. Each battery chemistry has an optimal cell size for maximum energy density, and small cells just aren’t as optimal. Pretty sure that’s why Tesla picked up the 21700: its energy density works out better than the 18650’s.
I thought the 21700 was for “packing” density, along with energy density to match.
Not everyone uses batteries for flashlights. Obviously there are plenty of 1.5 V applications out there right now.
I suspect economy of manufacture as well, i.e. much of the cost of making a cell must be in the casing materials and the processing to assemble it, which is probably little different between an 18650 and the “21-70” as Tesla name it. So the larger-capacity cell should inherently be cheaper per watt-hour of capacity.
Just as e.g. AAAs cost nearly as much as AAs, per unit, despite having 1/3 the capacity.
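A toy cost-per-watt-hour comparison shows the same pattern (the prices and capacities below are made up, but plausible):

```python
# Illustrative cost-per-Wh comparison; all figures are assumptions.
cells = {
    # name: (price_usd, capacity_mah, nominal_v)
    "AAA NiMH": (2.50,  800, 1.2),
    "AA NiMH":  (3.00, 2000, 1.2),
    "18650":    (5.00, 3500, 3.6),
    "21700":    (6.00, 5000, 3.6),
}

for name, (price, mah, volts) in cells.items():
    wh = mah / 1000 * volts
    print(f"{name:9s}: {wh:5.2f} Wh, ${price / wh:.2f}/Wh")
```

Even with similar per-unit prices, the bigger cell in each pair wins clearly on a per-Wh basis.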
I’m not sure what sort of simple LED driver gives 90% efficiency from a single Li-ion cell; far from it. FET direct drive is the worst, linear is better, but you need a buck driver for the best efficiency.
FET or linear drivers simply burn off the excess voltage from the cell (compared with the Vf of the LED) as heat. In the case of the FET, most of the heat ends up in the over-driven LED, which then operates at a less efficient current level, even when PWMed down to lower brightness. In the linear case it is dumped in the driver.
A boost driver for e.g. NiMH is no more complicated or less efficient than a buck (they are almost the same circuit design).
Thanks for enlightening me. :)
I had gotten the impression somewhere on this website that boost drivers were the least efficient, and that Li-Ion drivers are more efficient due to the higher voltages involved.
Do you have any suggestions for interesting topics about boost driver efficiency I can read, to learn more?
Direct drive is 100% efficient.
Not really; there is resistance everywhere, and adding PWM reduces efficiency more for direct drive than for other approaches.
Sure, no heat is dissipated in the driver, but with e.g. a cell at 4.2 V and an LED that needs say 3.5 V for the desired brightness, you are hammering the LED into an operating region way beyond peak efficiency.
PWMing it to lower brightness doesn’t change these facts.
There may be no wasted power dissipated in the driver; it’s instead dissipated in the LED, the internal resistance of the cell, and other bits of the torch, springs etc.
Direct drive is the crudest and least efficient way of powering an LED, until the cell is discharged to the same voltage as the LED Vf, by which point it is pretty well empty.
It is not obvious how inefficient direct drive is. Looking at my example you might think the system is 3.5/4.2 ≈ 83% efficient, but far from it: you also need to factor in the much higher current drawn by the LED at the higher voltage, which you can estimate using e.g. djozz’s measurements of LED transfer characteristics.
The most efficient system is to power the LED with direct current, at the level needed to achieve the desired brightness. LED efficiency drops dramatically when over-driven by direct drive. In the limit you can easily reach the point where e.g. 50% more power gives only 10% more light.
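To make that concrete, here is a toy linearized LED model in Python. The knee voltage, dynamic resistance and efficacy droop are invented numbers for illustration, and cell internal resistance is ignored; djozz’s measured curves are the real reference:

```python
# Toy model: LED forward current rises steeply above a knee voltage,
# and luminous efficacy (lm/W) falls as drive current rises.
# All constants are illustrative assumptions, not measured data.

V_KNEE = 2.8   # volts where the LED starts conducting (assumed)
R_DYN = 0.25   # ohms, dynamic resistance above the knee (assumed)

def led_current(v):
    """Forward current (A) at forward voltage v, linearized model."""
    return max(0.0, (v - V_KNEE) / R_DYN)

def efficacy(i):
    """Luminous efficacy in lm/W, drooping with current (assumed)."""
    return 150.0 / (1.0 + 0.5 * i)

def output_lm(v):
    i = led_current(v)
    return v * i * efficacy(i)

# Direct drive from a full cell vs. regulated at a gentler point:
for v in (4.2, 3.5):
    i = led_current(v)
    print(f"Vf={v} V: I={i:.2f} A, P={v * i:.1f} W, {output_lm(v):.0f} lm")
```

In this toy model the direct-driven LED draws several times the power of the regulated one for nowhere near proportionally more light, which is the point: the voltage ratio alone badly understates the loss.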
Once you start modding with e.g. spring bypasses, better FETs, in pursuit of headline lumens figures in turbo for brief blasts, you are also further reducing efficiency in normal operation.
Don’t kid yourself: a FET driver is always pushing and stressing the LED at full turbo level (its least efficient point), and hammering the cell at higher instantaneous current, leading to increased loss from internal resistance, even when PWMed back to much lower light output.
A linear driver can regulate current by burning off the excess voltage as heat, whilst operating the LED at an efficient point (generally, the lower the current, the more efficiently the LED performs). A buck (or boost) driver does this more elegantly, converting the voltage without wasting the excess as heat.
If you want to be pedantic, yes there is some resistance which causes losses.
So technically it is more like 99.9% efficient instead of 100%.
Other than that, direct drive is literally just like a wire.
The power going into the driver is the same as the power coming out.
Driver efficiency has nothing to do with how much current or voltage your LED is using, it is simply power out from the driver divided by power in to the driver.
Nonsense. The driver is essentially an impedance matching network, matching the characteristics of the cell, with those of the LED. It has to be looked at as a system.
A FET driver isn’t really a driver at all, just an on-off switch, performing no useful matching function, leaving the cell+LED system mismatched and mostly operating at its least efficient point.
Fortunately it mostly works quite well, but there are much better (more efficient) ways of doing it instead.
Watt-hours in, lumen-hours out (Edit: integrated over the cell duration) is to my mind the best measure of efficiency, but it is not easy to characterise. That’s why HKJ’s cell measurements and djozz’s LED measurements are so useful for those curious to get a grasp of what’s really going on.
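As a sketch of what that metric looks like in practice, here is how one might integrate a runtime log; every sample value below is made up purely for illustration:

```python
# Sketch: overall system efficiency as lumen-hours out per watt-hour in,
# integrated over a hypothetical runtime log sampled once a minute.
samples = [
    # (cell_voltage_V, cell_current_A, lumens) -- invented data points
    (4.1, 1.00, 500),
    (3.9, 1.05, 500),
    (3.7, 1.10, 490),
    (3.5, 1.20, 480),
]
DT_H = 1 / 60  # one-minute sampling interval, in hours

wh_in = sum(v * i * DT_H for v, i, _ in samples)
lmh_out = sum(lm * DT_H for _, _, lm in samples)

print(f"{wh_in:.3f} Wh in, {lmh_out:.1f} lm·h out "
      f"-> {lmh_out / wh_in:.0f} lm·h per Wh")
```

With real logs (HKJ-style cell discharge curves plus djozz-style LED output curves) the same sum would let you compare whole torches, driver and all, on a single number.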