What is the best high capacity 18650 battery?

Okay, I see what you’re saying. FET drivers tend to make extra capacity less useful, since you need to keep the battery mostly charged to get good performance (even with high-drain cells). That is why I prefer constant-current boost drivers (such as Zebralights) where the entire capacity of the battery can be used.

Though, most of my lights don’t have boost drivers, so I tend to use 30Q’s in them. The GA’s are saved for the Zebras.

Even at max output (2300 lumens), I don’t think the Zebra Plus uses anywhere close to the 10 amp rating of the GA battery. I think it’s somewhere between 6 amps (full charge) and 8 amps (near empty). Maybe even a bit less.
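For what it's worth, here is a rough back-of-the-envelope way to arrive at a number like that. A minimal Python sketch; the emitter efficacy and driver efficiency figures are my own assumptions, not measurements of the Zebra Plus:

```python
# Rough estimate of battery current for a ~2300 lumen boost-driven light.
# All numbers below are assumptions for illustration, not measurements.

LUMENS = 2300
EFFICACY_LM_PER_W = 100   # assumed emitter efficacy at this drive level
DRIVER_EFFICIENCY = 0.85  # assumed boost driver efficiency

led_power_w = LUMENS / EFFICACY_LM_PER_W           # ~23 W at the LED
battery_power_w = led_power_w / DRIVER_EFFICIENCY  # ~27 W from the cell

for v_cell in (4.1, 3.7, 3.3):  # full, mid, nearly empty (under load)
    amps = battery_power_w / v_cell
    print(f"{v_cell:.1f} v under load -> {amps:.1f} A from the battery")

# Prints roughly 6.6 A, 7.3 A and 8.2 A, in line with the 6-8 amp guess.
```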

I can’t think of a higher output single-emitter light that uses a single 18650. So, using a 10 amp cell (such as the Sanyo GA) in a single-emitter flashlight seems pretty safe to do. But, yeah, if you’re running it close to 10 amps, a higher drain cell might perform a bit better.

Edit: Looking at the graphs, at 5 amps the 30Q has 10Wh of energy, and the GA has 11Wh. I think the GA wins, except perhaps when you use it at really high drains. Though, the GA sags about 0.1v more than the 30Q, so a FET driver would like the 30Q more (but with less run time).
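For anyone wondering where a watt-hour figure like that comes from, it is just voltage times current added up over the discharge. A minimal sketch with made-up sample points, not HKJ's actual data; drop in real (time, voltage) pairs from a log to reproduce his numbers:

```python
# Energy delivered = voltage * current integrated over the discharge.
# The sample points below are placeholders, not HKJ's measurements.

CURRENT_A = 5.0

# (minutes since start, cell voltage under load)
discharge_log = [(0, 4.0), (10, 3.7), (20, 3.5), (30, 3.3), (38, 2.8)]

energy_wh = 0.0
for (t0, v0), (t1, v1) in zip(discharge_log, discharge_log[1:]):
    hours = (t1 - t0) / 60.0
    avg_voltage = (v0 + v1) / 2.0  # trapezoidal average over the step
    energy_wh += avg_voltage * CURRENT_A * hours

print(f"Energy delivered at {CURRENT_A:.0f} A: {energy_wh:.1f} Wh")
```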

In cold temps only around 30% of the energy in a GA or other high-capacity cell is usable. High-drain cells like the 30Q still give you around 80%.

See here.

I use my lights more in the winter than in the summer…

That was a nice find, thanks The_Driver. Further down there is another chart, and the Sony VTC6 was nearly as good as the 30Q in cold temps. The LG HG2 was not far behind either. I use my lights far more in winter too, so I should stock up on more 30Qs.

Quote from Megalodon in that thread:

"being said, I personally think the 30Q more and more, according to different testing convinced me of the battery more and more and more .... No matter what kind of applications. It seems to me that the 30Q is the universal cell for everything. Amazing what Samsung has brought to the market!"

No, you're misunderstanding.

FET drivers do not tend to make extra capacity less useful. I don't know where you got that idea from, or where a different driver design came into it. I was only talking about boost drivers.

Take a look at this battery comparison.

Regardless of the driver design, the high capacity battery gives you its extra bit of capacity only below 3.3v (with a 5A load). So to get the extra run time, you need to be running your batteries down until the low voltage protection kicks in (which you do). That's my first point, and it has nothing to do with driver design.
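To illustrate that crossover point, here is a small sketch comparing how much capacity each style of cell delivers above a given cutoff voltage. The curves are rough placeholders for a high-drain vs high-capacity cell at 5A, not HKJ's data:

```python
# How much of each cell's capacity is delivered above a given cutoff voltage.
# The curves below are rough illustrative sketches, not measured data.

# (cell voltage under load, cumulative Ah delivered by that point)
high_drain    = [(3.9, 0.3), (3.6, 1.2), (3.3, 2.2), (3.0, 2.8), (2.8, 2.9)]
high_capacity = [(3.9, 0.2), (3.6, 1.0), (3.3, 2.1), (3.0, 3.0), (2.8, 3.3)]

def delivered_above(curve, cutoff_v):
    """Capacity (Ah) delivered before the loaded voltage falls below cutoff_v."""
    delivered = 0.0
    for voltage, cumulative_ah in curve:
        if voltage < cutoff_v:
            break
        delivered = cumulative_ah
    return delivered

for cutoff in (3.3, 2.8):
    hd = delivered_above(high_drain, cutoff)
    hc = delivered_above(high_capacity, cutoff)
    print(f"cutoff {cutoff} v: high-drain {hd} Ah, high-capacity {hc} Ah")

# Down to 3.3v the high-drain cell is ahead; only if you keep discharging
# toward LVP does the high-capacity cell pull in front.
```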

Again, I see errors.

The GA cell is rated for 10A continuous, but it can actually do more than that. It can pull 15A, but not continuously, as it gets too hot. The way a boost driver works, keeping the input voltage high matters a lot. Voltage losses in the springs, etc. can cause the boost driver to not maintain turbo for very long. The higher you can keep the voltage (while under load), the longer it will run at turbo (excluding heat-related step-downs).

An 18650 cell is a fixed container size filled with a chemical mix (lithium, etc.). If the manufacturer wants to maximize the battery's capacity, he can't make the package bigger; he instead has to tweak the mix of battery chemicals. This compromises other characteristics of the battery, such as limiting its max discharge ability and giving it more voltage sag under load.

Even if a boost driver's max battery draw is only 8A, the extra voltage sag is what can really hurt it. It reduces its ability to run at turbo for long periods. If you needed to run it at turbo a lot (like I need to), you would want the higher drain 30Q battery. Since you run it at only 700 lumens, you don't notice this reduced turbo run time.
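Here is a quick sketch of why that extra sag matters on a boost driver. The power draw, current limit, and sag numbers are assumptions, just to show the mechanism:

```python
# Why extra voltage sag hurts turbo on a boost driver: the driver pulls roughly
# constant power from the cell, so the loaded voltage has to stay above some
# minimum. More sag means the cell drops out of turbo earlier in the discharge.
# The power figure, current limit, and sag values are assumptions.

TURBO_BATTERY_POWER_W = 27.0  # assumed draw from the cell at turbo
DRIVER_INPUT_LIMIT_A = 9.0    # assumed maximum input current of the driver

min_loaded_voltage = TURBO_BATTERY_POWER_W / DRIVER_INPUT_LIMIT_A  # 3.0 v

for label, sag in (("high-drain cell", 0.30), ("high-capacity cell", 0.40)):
    min_resting_voltage = min_loaded_voltage + sag
    print(f"{label}: turbo holds until the cell rests at about "
          f"{min_resting_voltage:.2f} v")

# The extra ~0.1v of sag means the high-capacity cell falls out of turbo
# noticeably earlier, even though it stores more total energy.
```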

Here is a cool chart Maukka did on the Acebeam EC65 which uses a high power boost driver. He tested a high capacity Acebeam battery against a high drain Samsung 30T.

You can see that the Acebeam battery only allowed full power turbo to be activated one time for about 2.5 minutes. After that its voltage was too low to allow it any more. The 30T's lesser voltage sag allowed six more applications of full turbo!

You can also see that the higher capacity Acebeam battery allowed for more run time at the lower levels. Normally it would not be that much extra, but this comparison was between the 30T, which is only 3000mAh, and the Acebeam cell, which was about 5000mAh. So a big gap in capacity. With a 3000mAh and a 3500mAh cell it's a smaller gap. Still, the high drain will allow for more turbo run time, while the high capacity will run a bit longer (as long as you let it run until LVP kicks in).

I personally would rather run the Zebra Plus with a 30Q because turbo times are more important to me than maximum low level run times plus I don’t like running the voltage down too far. So it all depends on how you use the light and your needs.

I forget my point, but I hope this all makes sense.

Your Zebra Plus is not single emitter. It’s a quad die. There are many other higher powered 18650 boost driver lights out there.

I’m not sure where you are getting the 10 and 11 Wh from. Can you explain?

When I look at the graph I posted above I see the 5A curve showing the 30Q high drain delivering more power from 4.2 volts down to 3.3 volts. From there on down, the GA has the advantage.

At 7A the crossover point is 3.2v

At 10A the crossover point is 3.1v

The FET and the Boost both prefer the high drain battery. Both driver designs will deliver the same amount of light on the same battery, it’s just the way they deliver that light that changes.

With a boost driver you get steps in output. It’s either steady output or no output (not counting thermal protection ramping it down).

With a FET driver you don't get steps, per se. When you remove the thermal related stuff, you get a steady sloping downward curve as voltage drops. The FET starts at a higher output; that's its advantage. The boost driver starts at a lower max level, but it's a steady output. That's its advantage.
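If it helps, here is a toy model of the two behaviors. The forward voltage, path resistance, and regulated current are assumed values, not measurements of any particular light:

```python
# Toy model of the two driver styles: a FET driver is essentially direct drive,
# so LED current (and output) falls as the cell voltage falls; a boost driver
# holds a regulated output until low-voltage cutoff. All values are assumptions.

LED_VF = 3.1            # assumed LED forward voltage
PATH_RESISTANCE = 0.08  # assumed total resistance: cell + springs + FET (ohms)
BOOST_REGULATED_A = 4.0 # assumed regulated LED current of the boost driver
BOOST_MIN_CELL_V = 2.9  # assumed low-voltage cutoff

for cell_v in (4.2, 4.0, 3.8, 3.6, 3.4, 3.2, 3.0):
    fet_current = max(0.0, (cell_v - LED_VF) / PATH_RESISTANCE)
    boost_current = BOOST_REGULATED_A if cell_v >= BOOST_MIN_CELL_V else 0.0
    print(f"cell {cell_v:.1f} v: FET ~{fet_current:4.1f} A, "
          f"boost {boost_current:.1f} A")

# The FET starts far higher but slides downward with the cell voltage;
# the boost driver starts lower but stays flat until cutoff.
```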

For your particular needs with that particular light, the GA makes more sense over the 30Q. It lets you squeeze a bit more run time from the lower levels. Plus you don't really miss the reduced turbo runtimes. It's a win-win (for you).

Whew, that’s too much writing. Let me know if you have any other specific questions.

Yes, they all use the same type of chemistry and thus behave similarly. I would always get the cheapest of the three. The differences are too small to be of note.

If it’s snowing outside, bring your flashlight inside the house. LOL

What is worrisome are the Tesla cars that use this high-capacity battery formula in a very cold climate. You may have to plug in a heater of some kind to keep the batteries from getting too cold. Maybe they already have built-in heaters to circulate warm water around the batteries? Maybe this heater can be activated and powered by plugging in the home-based charger? I don't know.

I think they have a heating function built-in. Otherwise they would never pass automotive tests.

I don’t buy it. I used my Zebralight SC600w MkIV HI plenty this past winter in the cold. Usually around –10C. I used the Sanyo GA, Samsung 30Q, and Sony VTC6 in it. I didn’t notice any difference in run time. Granted, I didn’t test it, but I’d certainly notice a difference between 30% and 80%!

I suspect it’s only an issue if the battery is kept constantly cold, even during operation. Since the battery warms up while I’m using the light, the usable energy isn’t that different between different battery brands. It’s interesting information, but I’m not sure how useful it is in real-world use.

Yes, they do, because by the time the battery is 30% or 40% drained, I need to recharge it because the output on the light has dropped so much. FET drivers are great on a full battery, but they suck as the battery discharges.

With a boost-driver, I can use 100% of the capacity, because it regulates output regardless of the voltage level.

Then it’s not a very good boost driver. I’m not sure where you’re going with that. On my Zebralights, they can deliver max output until the battery voltage drops to 2.9v. At that point, the battery is almost completely drained.

It will work just fine at full output. Yes, there’s an additional 0.1v of voltage sag, but the boost driver compensates. The extra capacity of the GA cell makes up for it. It’s watt-hours of energy that matters with a boost driver, not amp-hours or voltage under load.

Again, it doesn’t matter. The boost driver will work at any cell voltage more than 2.9v. At a 5A load, the GA cell has 11Wh of energy (as seen in HKJ’s written review), compared to 10Wh for the 30Q.

At 6-8 amps, the difference may be smaller, so I suspect there’s not much of an advantage to the GA cell if I were to run it on turbo constantly. But, I don’t do that (it would overheat unless it’s cold outside), so the extra energy in the GA cell does make a difference.
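As a rough sanity check on how much that extra watt-hour is worth at a regulated level: the LED power and driver efficiency below are assumptions, while the 10Wh and 11Wh figures are the ones quoted from HKJ's 5A tests.

```python
# Back-of-the-envelope runtime from the watt-hour figures: with a regulated
# boost driver, runtime at a fixed level is roughly energy / input power.
# LED power and driver efficiency are assumed; the Wh figures are from HKJ.

LED_POWER_W = 15.0        # assumed LED power at the level being run
DRIVER_EFFICIENCY = 0.85  # assumed boost driver efficiency
battery_power_w = LED_POWER_W / DRIVER_EFFICIENCY

for cell, energy_wh in (("Samsung 30Q", 10.0), ("Sanyo GA", 11.0)):
    runtime_min = energy_wh / battery_power_w * 60
    print(f"{cell}: about {runtime_min:.0f} minutes at this level")

# Roughly 34 vs 37 minutes - the GA's extra watt-hour buys a few more minutes,
# provided the light is allowed to run the cell down to LVP.
```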

:nerd_face: That’s being a little pedantic. I think you know what I meant when I said it’s a single emitter. I can only think of the XHP70.2 (which I’ll call a single emitter) which may require more than 10 amps of current from an 18650, but you’d have to be over-driving it I think (or pretty close). But, yeah, if I had a 1x18650 light with an XHP70.2 emitter, I’d probably use a high-drain cell.

It’s in his written reviews. Unfortunately, he only measures the total energy at drain levels up to 5A. But that’s probably a reasonable drain level for most 1000-1500 lumen lights.

The FET driver is great when the battery voltage is 4.2v. It sucks by the time the resting voltage drops to around 3.8v. The boost driver doesn’t care; it just runs at constant output.

No, that’s not how a boost driver works. It can start and continue at any output the driver is designed for, regardless of the battery’s input voltage. It can do better than a FET driver even on a fresh battery, if it’s designed to provide higher current to the LED.

They’d have to for charging anyway. You can’t charge safely below 0C. They probably use it for operating temperature, too. Or, maybe when the car is running the batteries heat up quickly enough by themselves?

Tesla has added a pre-heating function to their battery packs to preserve capacity in cold climes.

The cars are quite popular in Norway, so whatever performance degradation that still exists isn’t a strong deterrent to their adoption.

Do you have to wait until they heat up, before driving?

Well, I tried to explain it the best I could. That’s all I can do. :slight_smile:

There is nothing to buy. The test data is right there. You can compare it to the data from HKJ.
I guess you only use your light in the high modes and/or keep it warm with body heat. If you put the light outside for a few hours in the winter and then turn it on in the medium modes (let's say you only want 200 lumens), you will notice a difference if you actually use your light for a few hours.

Many people also have this problem with their smartphones when they go running for example and it’s really cold outside.

Yes, I’m not doubting the information that lithium-ion energy goes way down in the cold. It’s the “Samsung 30Q only goes down 30, while Sanyo GA goes down 80” that doesn’t sound right, at least in real-world use.

I notice in the testing that they deliberately keep the batteries chilled to –8 °C while performing the test, and run the batteries at a very high current (7 amps). I think that is the problem with their test:

1. Batteries will naturally warm up during use (especially at high currents), and

2. 7 amps is probably beyond what the GA cell can comfortably deliver in extreme cold, while the high-drain 30Q still performs okay.

If this test was done at lower currents (say the 200 lumen test you propose), I suspect the results would be very similar between the 30Q and the GA batteries, because the GA battery would be operating well within its specs.

I’ll try the 200-lumen real-world test next winter. The only issue is that it takes about 12 hours (at room temperature) for the Zebralight to drain a cell at that output. But, I suspect I can do the testing in under 6 hours, due to the cold.

Liion Wholesale is having a Clearance Sale in preparation for a move…

If anyone needs or wants any Protected Button Top LG MJ1’s …… they have a few left after I got through. :wink:

They have others on Clearance also.

Shipping seemed very reasonable too, considering I bought a couple dozen 18650’s & am in the USA. :+1: … :wink:

It works in tandem with the cabin pre-heating feature, and can be activated with the app prior to departure. These newfangled cars!

Suffice it to say, the engineers aren’t oblivious to cold conditions.

A lot can also be said about the company, and how it operates, etc., but Tesla’s battery engineering is first-rate and it is considered a leader. That might change when the other OEMs come online and demonstrate their prowess, but I suspect it will still be competitive at least, with the head start it has gained in battery and charging infrastructure.

Back to your regularly scheduled programming…

I like Tesla, but I don’t think they’re going to survive once other “real” car manufacturers catch up. Musk hasn’t met a production target on the new model yet. He’s not going to be able to compete against other companies that can pump out millions of cars, when he can pump out only thousands.

I’d like to see other companies purchase Tesla’s battery tech.

I actually like the LG for some reason. It's the one I have the most of, was recommended to me the most, and has failed the least. That being said, I vote for the VTC6A!

In general, Sanyo/Panasonic, Sony, Samsung, and LG ratings are closer to spec, but oftentimes they are still overrated and do not reflect real-world usage scenarios, so in a way the ratings are BS. It's the same as how lights from reputable brands such as Acebeam, Olight, Fenix, etc. measure 0-15% less than their specified output.

That’s why we need HKJ and Mooch tests to know how batteries perform in the real world. Also user tests are helpful too to confirm manufacturer’s ratings. Many batteries tested on my Xtar VP4 Dragon Plus reveal less than advertised capacities, with some exceptions like the Sony VTC6 that measures practically spot on or a bit higher than rating and Shockli 26650s that measure a good margin higher than the specified rating.

With that said, I’m wondering if anyone done any testing or have any experience on the Samsung 36G 18650 3600mah battery, which is on sale today for $4.99 at 18650batterystore.com ? I couldn’t find any reviews of it online.

I wouldn’t get it. All 3600mAh cells do achieve their ratings. However, since their internal resistance is higher than 3500mAh cells, they have less real world capacity than their lower capacity counterparts.

TLDR: Just get Samsung 30Qs for everything.