What is the best high capacity 18650 battery?

This pic is a favorite of mine.

:confounded:

There is no sand for the "right weight"?... I'm disappointed))))))))

Still the best pic out there. Should be posted on every battery thread.

Clearly you need to do some research and look at tests of the cells.
Maybe then you can explain why a cell rated at 3600mAh magically performs worse than a different brand’s 3400 or 3500mAh rated cell.

It would be ridiculous for me to say: “The SYNIOSBEAM’s output specs are complete BS. It’s expensive and huge, and it doesn’t even light up my back yard as well as the modded $20 BLF A6 I carry in my back pocket.” Right?

That’s pretty much what you are saying about battery ratings.

Manufacturers’ claims are based on measurements made under specified conditions. They almost certainly go to some lengths to make sure their measurements are repeatable over short and long periods, with regular, traceable calibration of their equipment. I don’t know what their sample size is, but I am sure it is more than a couple of cells.

HKJ’s tests are careful. He may not control conditions as carefully as a manufacturer might, but I don’t think people have any reason to doubt his results. His results often differ from those in manufacturer datasheets not because their ratings are bogus, or because his testing is more or less careful than theirs; they differ because he tests to a different standard, one more tailored to flashlight use. He also rarely (if ever) tests more than two cells, and the cells he tests are often of uncertain provenance.

Two of the major components of every flashlight, the emitter and the battery, are not designed and specified for the flashlight market. That makes it particularly important to understand the assumptions underlying the manufacturer specs. Independent tests to flashlight-oriented standards are very helpful in their own right, and for better understanding the manufacturer-provided specs. Calling the manufacturer-provided specs “BS” both cultivates ignorance and makes the independent tests less useful.

Well you’re right, the cost per output is probably the lowest of any LED flashlight under 800 lumens that exists :stuck_out_tongue:
Obviously it’s not meant for lighting up back yards :wink:

Ok then maybe I should say “the manufacturer specs are completely irrelevant to our flashlight applications, and a 3600mAh cell will not always outperform a 3500mAh rated cell, so independent high-current tests (1-10A) need to be looked at for the actual performance.”

The usefulness of high-capacity cells is overrated, IMHO.

I think for most flashlights, a high-capacity cell is more important than a high-drain cell. Most lights won’t use more than about 5 amps from a cell. Something like the Sanyo GA (3500mAh, 10 amp) cell gives noticeably longer run-times than a Samsung 30Q (3000mAh, 15 amp).

Using my Zebralight SC600w MkIV Plus on my bike, set to a constant output of 700 lumens, I get almost 3 hours of run-time on a GA battery, compared to almost 2.5 hours on a 30Q or VTC6. And since I like to give myself 0.5 - 1.0 hours of extra time (to look for a good place to change batteries), that works out to a battery change about every 2 hours with a GA, or 1.5 hours with a 30Q.

The high-capacity 3500mAh cell means that on a 4 hour bike ride, I only need to change the battery once. With a slightly lower capacity 3000mAh cell, I’m probably going to change it twice (or risk cutting it too close). It makes a real difference.
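If anyone wants the arithmetic, here’s a quick sketch. The 3-hour GA figure is my measured number from above; scaling the 30Q runtime by the capacity ratio is a rough assumption that ignores voltage-sag differences:

```python
# Rough runtime estimate at a fixed regulated output, assuming runtime
# scales linearly with usable capacity (ignores voltage-sag differences).
# The 3 h figure for the GA at 700 lumens is measured; the 30Q number
# is scaled by the capacity ratio.

GA_MAH = 3500
Q30_MAH = 3000
GA_RUNTIME_H = 3.0  # measured: SC600w MkIV Plus at 700 lm on a GA

q30_runtime_h = GA_RUNTIME_H * Q30_MAH / GA_MAH
print(f"Estimated 30Q runtime: {q30_runtime_h:.2f} h")  # ~2.57 h

# Battery-change interval, leaving a 0.5-1.0 h safety margin:
for name, runtime in [("GA", GA_RUNTIME_H), ("30Q", q30_runtime_h)]:
    print(f"{name}: change roughly every "
          f"{runtime - 1.0:.1f}-{runtime - 0.5:.1f} h")
```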

I guess I should explain a bit further.

High-capacity cells are only an advantage if you run them all the way down to the low-voltage protection and you’re using relatively low output levels.

I think both of these apply to your particular scenario.

I, on the other hand, tend to recharge my batteries once they’re down to 3.6 or 3.7 volts. That’s when they’ve lost the majority of their power.

I also tend to prefer longer max runtimes over longer lower-level runtimes. The boost driver in the Zebralight SC600w MkIV Plus is going to experience higher voltage sag on turbo using the Sanyo GA, which will reduce its total turbo run time. The 30Q will keep the voltage higher, allowing for more total run time on turbo.

My point being that the average person just assumes high capacity cells are better because the number is higher. It’s only better in very specific situations, so people should choose the battery that best fits their needs.

I was trying to avoid having to say all this, WalkIntoTheLight. Thanks for making me type it all out. :stuck_out_tongue:
:smiley:

Okay, I see what you’re saying. FET drivers tend to make extra capacity less useful, since you need to keep the battery mostly charged to get good performance (even with high-drain cells). That is why I prefer constant-current boost drivers (such as Zebralights) where the entire capacity of the battery can be used.

Though, most of my lights don’t have boost drivers, so I tend to use 30Q’s in them. The GA’s are saved for the Zebras.

Even at max output (2300 lumens), I don’t think the Zebra Plus uses anywhere close to the 10 amp rating of the GA battery. I think it’s somewhere between 6 amps (full charge) and 8 amps (near empty). Maybe even a bit less.

I can’t think of a higher output single-emitter light that uses a single 18650. So, using a 10 amp cell (such as the Sanyo GA) in a single-emitter flashlight seems pretty safe to do. But, yeah, if you’re running it close to 10 amps, a higher drain cell might perform a bit better.

Edit: Looking at the graphs, at 5 amps the 30Q has 10Wh of energy, and the GA has 11Wh. I think the GA wins, except perhaps when you use it at really high drains. Though, the GA sags about 0.1v more than the 30Q, so a FET driver would like the 30Q more (but with less run time).
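A quick sketch of the energy-to-runtime math, using the watt-hour figures quoted above; the LED power and driver efficiency are assumed round numbers, not measurements:

```python
# With a regulated boost driver it's watt-hours that set runtime.
# Energy figures (at a 5A drain) are the ones quoted above from the
# graphs; LED power and converter efficiency are assumptions.

GA_WH = 11.0   # Sanyo GA at 5A (quoted above)
Q30_WH = 10.0  # Samsung 30Q at 5A (quoted above)

LED_POWER_W = 10.0  # assumed LED power at a mid level
DRIVER_EFF = 0.90   # assumed boost-converter efficiency

for name, wh in [("GA", GA_WH), ("30Q", Q30_WH)]:
    hours = wh * DRIVER_EFF / LED_POWER_W
    print(f"{name}: ~{hours:.2f} h at {LED_POWER_W:.0f} W to the LED")

# The ~10% energy gap translates directly into ~10% more runtime
# for the GA, if you drain both cells all the way down.
```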

In cold temps only around 30% of the energy in a GA or other high-capacity cell is useable. High-drain cells like the 30Q still give you around 80%.

See here.

I use my lights more in the winter than in the summer…

That was a nice find, thanks The_Driver. Further down there is another chart, and the Sony VTC6 was nearly as good as the 30Q in cold temps. The LG HG2 was not far behind either. I use my lights far more in winter too, so I should stock up on more 30Qs.

Quote from Megalodon in that thread:

"being said, I personally think the 30Q more and more, according to different testing convinced me of the battery more and more and more .... No matter what kind of applications. It seems to me that the 30Q is the universal cell for everything. Amazing what Samsung has brought to the market!"

No, you’re misunderstanding.

FET drivers do not tend to make extra capacity less useful. I don’t know where you got this idea from. Where did a different driver design even come into this? I was only talking about boost drivers.

Take a look at this battery comparison.

Regardless of the driver design, the high-capacity battery gives you its extra bit of capacity only below 3.3v (with a 5A load). So to get the extra run time, you need to be running your batteries down until the low-voltage protection kicks in (which you do). That’s my first point, and it has nothing to do with driver design.

Again, I see errors.

The GA cell is rated for 10A continuous, but it can actually do more than that. It can pull 15A, just not continuously, as it gets too hot. The way a boost driver works has a lot to do with feeding it amperage at a high voltage. Voltage losses in the springs, etc. can cause the boost driver to not maintain turbo for very long. The higher you can keep the voltage (while under load), the longer it will run at turbo (excluding heat-related step-downs).
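To put rough numbers on that, here’s a minimal sketch assuming constant output power and a fixed converter efficiency; both numbers are my own illustrative assumptions, not measurements of any particular driver:

```python
# Why sag hurts a boost driver: at constant output power, battery
# current rises as the cell voltage under load falls:
#   I_in = P_out / (eff * V_in)

LED_POWER_W = 25.0  # assumed turbo power to the emitter
EFF = 0.88          # assumed boost-converter efficiency

for v_bat in (4.0, 3.7, 3.4, 3.1):
    i_in = LED_POWER_W / (EFF * v_bat)
    print(f"{v_bat:.1f} v under load -> {i_in:.1f} A from the cell")

# An extra 0.1-0.2 v of sag pushes the cell toward the driver's
# input-current (or low-voltage) limit sooner, which is what cuts
# turbo short.
```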

An 18650 cell is a fixed container size filled with a chemical mix (lithium, etc.). If the manufacturer wants to maximize the battery’s capacity, he can’t make the package bigger; he has to tweak the mix of battery chemicals instead. This compromises other characteristics of the battery, such as limiting its max discharge rate and giving it more voltage sag under load.

Even if a boost driver’s max battery draw is only 8A, the extra voltage sag is what can really hurt it. It reduces its ability to run at turbo for long periods. If you needed to run it at turbo a lot (like I do), you would want the higher-drain 30Q battery. Since you run it at only 700 lumens, you don’t notice this reduced turbo run time.

Here is a cool chart Maukka did on the Acebeam EC65 which uses a high power boost driver. He tested a high capacity Acebeam battery against a high drain Samsung 30T.

You can see that the Acebeam battery only allowed full-power turbo to be activated one time, for about 2.5 minutes. After that its voltage was too low to allow any more. The 30T’s lesser voltage sag allowed six more applications of full turbo!

You can also see that the higher-capacity Acebeam battery allowed for more run time at the lower levels. Normally it would not be that much extra, but this comparison was between the 30T, which is only 3000mAh, and the Acebeam cell, which was about 5000mAh. So, a big gap in capacity. With a 3000mAh and a 3500mAh cell it’s a smaller gap. Still, the high-drain cell will allow for more turbo run time, while the high-capacity cell will run a bit longer (as long as you let it run until LVP kicks in).

I personally would rather run the Zebra Plus with a 30Q because turbo times are more important to me than maximum low level run times plus I don’t like running the voltage down too far. So it all depends on how you use the light and your needs.

I forget my point, but I hope this all makes sense.

Your Zebra Plus is not single-emitter. It’s a quad-die. There are many other higher-powered 18650 boost-driver lights out there.

I’m not sure where you are getting the 10 and 11 Wh from. Can you explain?

When I look at the graph I posted above I see the 5A curve showing the 30Q high drain delivering more power from 4.2 volts down to 3.3 volts. From there on down, the GA has the advantage.

At 7A the crossover point is 3.2v

At 10A the crossover point is 3.1v
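For anyone who wants to eyeball intermediate currents, here are those crossover points as a tiny sketch; the linear interpolation between the plotted currents is my own assumption, not something from the graph:

```python
# Crossover voltages read off the comparison graph above: below this
# cell voltage (under load), the GA out-delivers the 30Q. Linear
# interpolation between the plotted currents is an assumption.

CROSSOVER = {5.0: 3.3, 7.0: 3.2, 10.0: 3.1}  # amps -> volts

def crossover_voltage(amps):
    pts = sorted(CROSSOVER.items())
    if amps <= pts[0][0]:
        return pts[0][1]
    if amps >= pts[-1][0]:
        return pts[-1][1]
    for (a0, v0), (a1, v1) in zip(pts, pts[1:]):
        if a0 <= amps <= a1:
            return v0 + (v1 - v0) * (amps - a0) / (a1 - a0)

print(crossover_voltage(6.0))  # ~3.25 v, midway between 5A and 7A
```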

The FET and the Boost both prefer the high drain battery. Both driver designs will deliver the same amount of light on the same battery, it’s just the way they deliver that light that changes.

With a boost driver you get steps in output. It’s either steady output or no output (not counting thermal protection ramping it down).

With a FET driver you don’t get steps, per se. When you remove the thermal-related stuff, you get a steadily sloping downward curve as voltage drops. The FET starts at a higher output; that’s its advantage. The boost driver starts at a lower max level, but it’s a steady output. That’s its advantage.
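If it helps, here’s a toy model of those two output shapes. All the numbers are illustrative; real direct drive isn’t linear, and real regulated levels vary by light:

```python
# Toy model of the two output shapes described above. The FET
# (direct-drive) output falls with battery voltage; the boost driver
# holds a flat level until the cell hits low-voltage protection.

LVP_V = 2.9          # low-voltage cutoff (as on the Zebralights)
BOOST_LUMENS = 1500  # assumed regulated level

def fet_output(v_bat):
    # Direct drive: output tracks voltage; crude linear stand-in.
    return max(0.0, 2300 * (v_bat - LVP_V) / (4.2 - LVP_V))

def boost_output(v_bat):
    # Regulated: flat until LVP, then off.
    return BOOST_LUMENS if v_bat > LVP_V else 0.0

for v in (4.2, 3.9, 3.6, 3.3, 3.0):
    print(f"{v:.1f} v: FET ~{fet_output(v):4.0f} lm, "
          f"boost {boost_output(v):4.0f} lm")
```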

For your particular needs with that particular light, the GA makes more sense than the 30Q. It lets you squeeze a bit more run time from the lower levels, and you don’t really miss the reduced turbo runtimes. It’s a win-win (for you).

Whew, that’s too much writing. Let me know if you have any other specific questions.

Yes, they all use the same type of chemistry and thus behave similarly. I would always get the cheapest of the three. The differences are too small to be of note.

If it’s snowing outside, bring your flashlight inside the house. LOL

What is worrisome are the Tesla cars that use this high-capacity battery formula in a very cold climate. You may have to plug in a heater of some kind to keep the batteries from getting too cold. Maybe they already have built-in heaters that circulate warm water around the batteries? Maybe this heater can be powered by plugging in the home charger? I don’t know.

I think they have a heating function built-in. Otherwise they would never pass automotive tests.

I don’t buy it. I used my Zebralight SC600w MkIV HI plenty this past winter in the cold. Usually around –10C. I used the Sanyo GA, Samsung 30Q, and Sony VTC6 in it. I didn’t notice any difference in run time. Granted, I didn’t test it, but I’d certainly notice a difference between 30% and 80%!

I suspect it’s only an issue if the battery is kept constantly cold, even during operation. Since the battery warms up while I’m using the light, the usable energy isn’t that different between different battery brands. It’s interesting information, but I’m not sure how useful it is in real-world use.

Yes, they do, because by the time the battery is 30% or 40% drained, I need to recharge it because the output on the light has dropped so much. FET drivers are great on a full battery, but they suck as the battery discharges.

With a boost-driver, I can use 100% of the capacity, because it regulates output regardless of the voltage level.

Then it’s not a very good boost driver. I’m not sure where you’re going with that. On my Zebralights, they can deliver max output until the battery voltage drops to 2.9v. At that point, the battery is almost completely drained.

It will work just fine at full output. Yes, there’s an additional 0.1v of voltage sag, but the boost driver compensates. The extra capacity of the GA cell makes up for it. It’s watt-hours of energy that matters with a boost driver, not amp-hours or voltage under load.

Again, it doesn’t matter. The boost driver will work at any cell voltage more than 2.9v. At a 5A load, the GA cell has 11Wh of energy (as seen in HKJ’s written review), compared to 10Wh for the 30Q.

At 6-8 amps, the difference may be smaller, so I suspect there’s not much of an advantage to the GA cell if I were to run it on turbo constantly. But, I don’t do that (it would overheat unless it’s cold outside), so the extra energy in the GA cell does make a difference.

:nerd_face: That’s being a little pedantic. I think you know what I meant when I said it’s a single emitter. I can only think of the XHP70.2 (which I’ll call a single emitter) which may require more than 10 amps of current from an 18650, but you’d have to be over-driving it I think (or pretty close). But, yeah, if I had a 1x18650 light with an XHP70.2 emitter, I’d probably use a high-drain cell.

It’s in his written reviews. Unfortunately, he only measures the total energy at drain levels up to 5A. But that’s probably a reasonable drain level for most 1000-1500 lumen lights.

The FET driver is great when the battery voltage is 4.2v. It sucks by the time the resting voltage drops to around 3.8v. The boost driver doesn’t care; it just runs at constant output.

No, that’s not how a boost driver works. It can start and continue at any output the driver is designed for, regardless of the battery’s input voltage. It can do better than a FET driver even on a fresh battery, if it’s designed to provide higher current to the LED.

They’d have to for charging anyway. You can’t charge safely below 0C. They probably use it for operating temperature, too. Or, maybe when the car is running the batteries heat up quickly enough by themselves?

Tesla has added a pre-heating function to their battery packs to preserve capacity in cold climes.

The cars are quite popular in Norway, so whatever performance degradation that still exists isn’t a strong deterrent to their adoption.

Do you have to wait until they heat up, before driving?