How critical is the 2.5V cut-off voltage for Li-Ion battery life?

Worse, how? Quote me at least one reputable source that claims this is detrimental to the cell and supports that belief with some sort of evidence: studies, articles, etc.

Cheers :-)

Here’s an interesting article (don’t know how scientific it is) that refers to BU-808: How to Prolong Lithium-based Batteries - Battery University

They explicitly say: “Discharge depth has zero effect on battery life time.”

Barkuti is right.

The time spent is what really matters as long as you don’t go below 2V.

If you spend, say, 5-10 seconds at 2.0V, it isn’t a problem.

If you stay for, say, 10 days at 2V, now that is starting to be problematic.

That’s the common belief, but I have not seen any evidence of it. Anyone got a link to some testing that shows this? Barkuti’s link, showing that discharging to 2.5V is perfectly fine, is very interesting.

My own long-term storage voltage is 3.7V-3.75V, but I’m only basing that on hearsay. It would be nice to see some testing that shows it’s the “best” voltage.

There are studies based on depth of discharge. I don’t know if something like continuous 0-30% cycling was ever tested.

In the article linked above, Battery University notes that cycling at mid-state-of-charge voltages will lead to the best longevity.

Once I did a driver mod and somehow ended up with a 100 ohm resistor across the battery. After leaving it on the shelf for a few weeks, I went to charge the cell (a 30Q) and found it at 1.6 volts. I thought for sure it was fried, but I charged it back up anyway, and I was pleasantly surprised when I measured the discharge capacity at 2850mAh. :laughing:
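As a rough sanity check on the story above, here’s a back-of-envelope sketch of how fast a 100 ohm bleed resistor empties a cell. The ~3.7V nominal voltage and ~3000mAh capacity are my own assumptions, not from the post:

```python
# Rough back-of-envelope: how fast does a 100-ohm bleed resistor drain a cell?
# Assumes a constant nominal voltage, which overstates the current near empty.

def drain_hours(capacity_mah: float, resistance_ohm: float, nominal_v: float = 3.7) -> float:
    """Approximate hours to empty a cell through a fixed resistor (I = V/R)."""
    current_ma = nominal_v / resistance_ohm * 1000.0  # e.g. 3.7V / 100 ohm = 37 mA
    return capacity_mah / current_ma

hours = drain_hours(3000, 100)  # a ~3000 mAh cell like a 30Q
print(f"~{hours:.0f} h (~{hours / 24:.1f} days) to nominal empty")
```

In reality the current tapers off as the voltage falls, so the drain takes somewhat longer than the ~3.4 days this gives, but “a few weeks on the shelf” is still far past empty, consistent with finding the cell at 1.6V.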

Here’s the OP again… and the question still is: how critical is the 2.5V cut-off voltage for Li-Ion battery life?

What I mean is: is discharging down to 2.5V the same as discharging down to, let’s say, 3.6V?

I could imagine that keeping the battery at 2.5V might have a negative impact (like storing it at 4.2V).

But if you discharge down to 2.5V or 3.6V, respectively, and recharge soon afterwards… is there a difference in battery health?

The link you posted seems to show that discharging to 2.5V has no effect on health. But they don’t show any testing that indicates whether it’s okay to leave a cell at that voltage for the long term.

The following paper seems to show that storage at 0% charge (around 2.5V?) is better for retaining capacity than any higher charge level.

http://jes.ecsdl.org/content/163/9/A1872.full

(See figure 2)

I’m not sure I completely believe it, though, without some other studies to confirm.

Ha, ha… I started this thread because I thought that ‘deep’ discharge down to the cut-off voltage was even worse than high voltages… but so far even the BLF experts can’t come to a clear answer…

That’s because there doesn’t appear to be a lot of research available on modern lithium-ion chemistries. Most of the standard rules about lithium-ion cells seem to be very old, from back when the chemistry was mainly LiCoO2, without the manganese and nickel that they all use now.

I think cells are safer and more robust now than they used to be.

That’s what I was saying in a previous post: I have not seen, nor do I know of, any tests that have been done besides the Battery University info.
I have heard a lot of myths over the years, and some of them have turned out to be false. Several years ago, when I was a member of the other forum, there was a guy there everyone considered a battery guru. I learned a lot from him, and most people took what he said as gospel. I have no idea of his background, but he seemed to know his stuff. The point is, he was wrong on some things, as I personally saw after several years of believing him, once I finally tested it myself and saw the same results from other members here as well.
I don’t think anyone here can give you an exact, factual answer. Test it yourself and let us know what you find. :+1:
I will add that the current load on the cell will have a lot to do with how your test turns out.
There’s a lot of good info in this thread.

Mmmkay, straight crude answers for straight questions:

It is not critical, and it is safe.

Presuming you start the discharge at ≈4.2V, going down only to 3.6V is about a 55-65% cycle, but going down to 2.5V is a 100% cycle.

I could imagine that too, and it still doesn’t make it right.

After cut-off at whatever voltage, cells don’t rest at those voltage levels; they rebound to something more like 3.1V+.

Besides the extra depth of discharge cycling and wear, no.

Cheers ^:)
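The depth-of-discharge arithmetic above (4.2V → 3.6V being roughly 55-65% of a cycle, 4.2V → 2.5V being 100%) can be sketched with a simple open-circuit-voltage lookup. The OCV table below is a made-up illustration for a generic 18650, not measured data; real curves depend on chemistry, load, and temperature:

```python
# Illustrative only: map open-circuit voltage to approximate state of charge
# by linear interpolation over an assumed OCV table (invented values).

OCV_TABLE = [  # (voltage, state-of-charge fraction)
    (2.5, 0.00), (3.0, 0.05), (3.4, 0.15), (3.6, 0.40),
    (3.7, 0.55), (3.8, 0.70), (4.0, 0.88), (4.2, 1.00),
]

def soc_from_ocv(v: float) -> float:
    """Linearly interpolate state of charge from the assumed OCV table."""
    if v <= OCV_TABLE[0][0]:
        return 0.0
    if v >= OCV_TABLE[-1][0]:
        return 1.0
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v0 <= v <= v1:
            return s0 + (s1 - s0) * (v - v0) / (v1 - v0)

# Depth of discharge for a 4.2V -> 3.6V cycle vs a 4.2V -> 2.5V cycle:
print(f"to 3.6V: ~{(1.0 - soc_from_ocv(3.6)) * 100:.0f}% DoD")
print(f"to 2.5V: ~{(1.0 - soc_from_ocv(2.5)) * 100:.0f}% DoD")
```

With this particular table the 4.2V → 3.6V cycle works out to ~60% DoD, which lands inside the 55-65% range quoted above; the exact number is only as good as the assumed curve.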

Well, ya wanna talk about weird…

The cells that came with the NexTorch lights I just snagged are rated 2000mAh (I think; topping them off now, so I’ll look later and correct it) and are protected. Okay, so they clocked in at something like 2050, 2105, 2150, right in that ballpark.

But man, when doing a capacity test at a stinky 500mA, from 3.5V they dropped like a rock. At 3.4V I could literally watch the voltage drop with each update (10sec?). Below about 3.1V it dropped so fast that even a 2.8V cutoff would’ve fallen inside that 10-second window.

Gonna try ’em in a nice hungry light like the UT20 or SP32A, and crank it up to 11, see if I get decent brightness, if it’s resistance-limited, or if the protection kicks in. That should be interesting.

So I don’t think it’d matter at what voltage I let these critters sit; they probably wouldn’t be too happy either way.

Actually, no, the label says 2200mAh. So they’re a bit below spec.
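For what it’s worth, capacity figures like the ones above come from coulomb counting: the analyzing charger integrates current over time until the cut-off voltage is reached. A minimal sketch (the sample log values are invented, not from the NexTorch cells):

```python
# Sketch of coulomb counting: an analyzing charger estimates capacity by
# integrating current over time during a constant-current discharge.
# A real logger also records voltage, to stop at the cut-off.

samples = [  # (seconds_elapsed, current_mA) at a nominal 500 mA load
    (0, 500), (7200, 500), (14400, 500),
]

def capacity_mah(log):
    """Trapezoidal integration of current (mA) over time (s) -> mAh."""
    total = 0.0
    for (t0, i0), (t1, i1) in zip(log, log[1:]):
        total += (i0 + i1) / 2 * (t1 - t0) / 3600.0
    return total

print(f"{capacity_mah(samples):.0f} mAh")
```

At a steady 500mA, four hours to cut-off works out to 2000mAh, which is why these ~2000-2150mAh results took roughly an afternoon per cell.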

Okay, moment of trvth…

Lights up nice’n’bright. Okay, so they’re usable in normal lights.

I’ve noticed that all my protected cells are basically empty below about 3.6V, whereas my unprotected cells work well down to 2.8V.

However, I doubt the protection circuit is the problem. Most of my protected cells are likely re-wraps of crap cells; the cell itself is probably to blame.

I don’t really trust protected cells. You never know what’s really under there.

Wellp, like I keep telling people, just give ’em to someone you hate.

What is a protected cell? :smiley: :partying_face: :confounded:

Let’s face it, there are a lot of myths about this stuff, and it is clear from this thread that new ones are in the process of being made (ahem, Barkuti: your hypothesis is just that, a hypothesis).

In the end, though, as long as you are operating within the ranges specified for the cells, they are there to serve you, not to be served by you. If you are building a big pack for a car or an eBike, it’s probably worth doing some optimization for cycle life. For single cells that you load and unload in the course of regular use? I don’t see it.

The only facts I know to be true are the datasheet facts. They were tested, and they are guaranteed by the manufacturer.
When you go outside the specs listed in the datasheet, there will be side effects. Most people will not care or notice, because they will be buying and using a newer super-cell by the time the side effects even start. I buy several new batteries a year, especially when something better comes along. If you’re looking to build something that needs to last a given number of cycles, then the parameters are there in the datasheet and guaranteed.

Maybe it’s just me, but I only really buy new “supercells” for lights that need them (Q8, etc.), and wring out every drop of performance from even crap cells in low-stress lights.

Eg, I’m not about to waste a 30Q or VTC5 in a headlight, so I’ll use a come-with cell, a laptop pull, whatever. Unless I go spelunking (yah, not gonna happen…), who cares if I have to top it off every few months whether it needs it or not?

Most of my perfectly behaved hi-po cells will already be aging out by the time I’d see any significant decrease in capacity.

So… while I try to feed’n’water my cells and keep them happy, I rarely if ever run them down, and just charge them to whatever the charger says is full. If I have laptop pulls that Just Won’t Die, I’m not too worried about even primo cells.

But those facts are not what I am referring to in my thread… 4.20V is also within the datasheet facts… but everybody knows that a cell says ‘1000 thanks!’ if you don’t stress it all the way up to 4.20V.

The 2.5V that I am mentioning in my thread is also within range for many cells… in fact, it is the specified cut-off voltage…

But I think we all simply don’t know the answer to my question. Rephrased: is going down to 2.5V as bad as, or even worse than, going up to 4.2V?

My question only relates to within the specs of any data sheet.

I have no problem if there is no definite answer… then at least I know that there is none…