Hello, the data sheets of usual ‘3.7V’ Li-Ion batteries specify a 2.5V cut-off voltage for discharge and 4.2V for charge.
I know about the meaning and consequences regarding the ‘4.2V’… I never charge up to 4.2V (but this should not be the topic of this thread….)
This ‘4.2V’ is the reason I bought ‘3.6V’ LG D1 cells with a 4.35V termination voltage (and a 3.0V cut-off voltage)… meaning I ‘fully’ charge them to 4.2V, but they are not actually fully charged…
OK, regarding charging and termination voltage everything is ‘under control’ and I know what I am doing.
But how critical is the 2.5V cut-off voltage for ‘usual’ Li-Ion batteries, or 3.0V for ‘4.35V’ batteries… keeping in mind that almost all flashlights and dischargers cut off around 2.8V or higher?
Is discharging to 2.5V (or 3.0V) as critical as, or even more critical than, charging to 4.2V (or 4.35V)? I know deep discharge is not good and damages batteries.
Edit: I am not talking about discharging lower than the spec sheet’s cut-off voltage… my worst case would be exactly that voltage…
From what I’ve read, it’s critical. Discharging lower will physically damage your batteries due to chemical reactions, which causes the internal resistance to go up and the capacity to go down. It might even make the battery unstable.
I usually don’t discharge below 2.8V on regular chemistries.
Generally the life of Li-Ion cells is closely tied to how far they are discharged between charges: the further you discharge, the shorter the life expectancy. All that charging a 4.35 volt battery to only 4.2 volts does is reduce the capacity, and possibly extend the life slightly. If the manufacturer recommends not going below 3.0 volts, there is probably a good reason, and odds are it is a significant negative impact on the life expectancy of the cell.
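To make the depth-of-discharge (DoD) vs. lifetime trade-off concrete, here is a back-of-the-envelope sketch. To be clear: the cycle counts and capacity below are purely illustrative placeholders I made up for the example, not from any datasheet; real numbers vary per cell model.

```python
# Rough illustration of the depth-of-discharge (DoD) vs. cycle-life trade-off.
# All numbers are ILLUSTRATIVE PLACEHOLDERS, not from any datasheet.

CELL_CAPACITY_AH = 3.0  # assumed capacity of a typical 18650

# Hypothetical table: DoD fraction -> cycles until 70% of original capacity
dod_to_cycles = {
    1.00: 300,   # full 100% discharges
    0.50: 1200,  # shallower 50% discharges
    0.25: 2800,  # very shallow 25% discharges
}

for dod, cycles in sorted(dod_to_cycles.items()):
    # Total energy throughput over the cell's life at this DoD
    throughput_ah = dod * CELL_CAPACITY_AH * cycles
    print(f"DoD {dod:4.0%}: {cycles:5d} cycles -> "
          f"{throughput_ah:6.0f} Ah total throughput")
```

The point of the sketch is only that shallow cycling typically yields more total throughput before the cell wears out, which is why going all the way down to the cut-off on every discharge is discouraged.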
If you want the maximum lifetime of your cells, don’t fully charge them, store them at 40% and don’t discharge them fully. But why would you do that? 18650 are readily available. If a cell is worn, just replace it.
If you keep it within spec, you can expect an in-spec lifetime. If you do all of the above you might get a bit more out of a cell. But again, in what application is this useful?
Not sure I understand what you’re asking then. The link shows the DoD percentage versus the number of cycles possible until the battery drops to 70% of its original capacity. It says down to 3.0V is standard; everywhere I looked says 3.0V.
If you’re draining below 3.0V, then I say yes, it’s shortening the cell’s life. How much :question:
If you charge the cell as soon as it goes below 3.0V (hits the LVP), then I’d guess you would keep the damage to a minimum. I was always told that a cell left in a drained state of charge degrades faster than one charged immediately after discharging. I have also seen never-used Li-Ion cells (sat on a shelf for 3 years) self-discharge to below 2.0V and, when charged up, act like perfectly new cells that just rolled off the assembly line. FYI - some 'new' Fujitsu laptop batteries.
You may just have to collect the data yourself and let us know how it goes. Unless someone else has tested this before, but I haven’t seen any.
2.5V is the lower limit of the cut-off voltage of most lithium-ion cells, excluding LiFePO4 and LTO cells.
It doesn’t really matter that it goes down to 2.5V; what matters is how long it stays below 2.5V, or even worse, below 2V.
The absolute lower limit for any lithium-ion cell is 2V. Below 2V, it becomes critical to charge the cell with the lowest current possible, around 50-100mA.
Otherwise, internal chemical and structural damage can happen very quickly if you charge too rapidly from below 2V.
This is the reason many chargers now have recovery modes for over-discharged cells: if they detect that a cell’s voltage is too low to charge normally, they enter a special low-current mode, and once the cell reaches 3V they resume normal charging.
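The recovery logic described above can be sketched roughly like this. The exact thresholds and currents are my assumptions for illustration (based on the 2V / 3V / 50-100mA figures in this thread); real chargers differ per model, and whether a charger refuses a cell below ~2V or trickles it anyway varies.

```python
# Sketch of a charger's low-voltage recovery decision, using the
# thresholds mentioned in the thread. These values are ASSUMPTIONS
# for illustration; real chargers use their own limits.

DEAD_V = 2.0       # below this, many chargers refuse to charge at all
RECOVERY_V = 3.0   # below this, charge gently ("pre-charge" / recovery)
RECOVERY_MA = 50   # low recovery current, e.g. 50-100 mA
NORMAL_MA = 1000   # normal CC charge current

def charge_mode(cell_voltage: float) -> tuple[str, int]:
    """Return (mode, current_mA) for a given resting cell voltage."""
    if cell_voltage < DEAD_V:
        return ("reject", 0)              # too low: refuse / flag the cell
    if cell_voltage < RECOVERY_V:
        return ("recovery", RECOVERY_MA)  # trickle until 3.0 V is reached
    return ("normal", NORMAL_MA)          # resume the usual CC/CV charge

print(charge_mode(1.5))  # ('reject', 0)
print(charge_mode(2.6))  # ('recovery', 50)
print(charge_mode(3.7))  # ('normal', 1000)
```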
Wellp, how many people actually run down a cell ’til the device itself cuts off?
Today, I felt my pocket getting a bit warm, and sure enough, somehow the tailcap of my light got pushed hard enough to click it on. :person_facepalming:
Figure it hadn’t been on that long, and I’ve been using it off’n’on (haha) over a few weeks, probably not more’n 15min total up until today, yet I’ll probably be topping it off sometime soon.
Think I only once had an S2+ cut off on me, after the light got visibly dimmer for a few min beforehand. Other’n that once, I don’t think I ever had a cell go dead whilst I was actually using the light.
Besides, I wouldn’t want to be carrying a light with only 15min runtime, in case I would need to use it longer.
For sure, after any heavy usage, I’d top it off anyway.
So… maybe it’s just me, but I almost never have a cell get run down nearly that low.
I know this isn’t exactly what the OP is asking for, but I’ll just expand on the above a little bit.
I was reading through the spec sheet of the Samsung INR18650-30Q cell, and noticed something interesting in their recommendations for pack design.
Samsung specifies “Under 1.0V voltage, do not charge the cell”. They list 1.0V - 3.0V as “Voltage range of cell which shall be charged by Pre-charging” (charge slowly, I presume). 2.5V is the recommended low-voltage cut-off, and 2.0V is “shut down your BMS”.
They don’t mention any length of time a cell can sit near 1.0V. I find it interesting that they say it’s okay to charge a cell above 1.0V. Previously, I’ve only seen the 2.5V low-voltage cut-off mentioned.
Cutting off at 2.5V may just be a way to make sure the cell stays above 1.0V, if it’s left sitting awhile to self-discharge.
Anyway, at least for the 30Q, it appears that cells are dangerous only if discharged below 1.0V.
But, yes, if it’s below 3.0V, charge it slowly until it reaches that threshold. Most chargers will do that automatically. And if it’s below 1.0V, especially if it’s sat there for awhile, maybe it’s best to throw it away. This applies to the 30Q cells, so I’m not sure if other cells are the same.
If you look at a lot of high-drain cells, they go lower than 2.5V. Look at the HE2, HE4, and HG2 data sheets: LG gets their mAh rating by discharging down to 2.0V.
And the 25R data sheet has a pulse-discharge cycle-life graph going down to 1.5 volts at something like 60 amps. I’ll say it again, like I always do randomly on some of these threads: lithium should be respected, but the danger is overrated.

I’ve recovered around 200 cells at 1.5 volts or lower out of tool packs. I’ve had some cells die completely, but that was it: no vent, no nothing, they just stopped working, showed no voltage and wouldn’t charge. Maybe 5 or 6 cells have died; all the others I used until the capacity just got too low for my use.

Whenever I recover deeply discharged cells I bring them up really slowly, at 20 or 30mA. Once a cell hits 2.5 to 3V I hit it with 1 amp and check back every 10 minutes or so. If it gets hot it goes back to recycling. I’ve only had a few cells I was worried about because of heat during charging, so I erred on the side of caution; you can tell the difference between the usual warmth and a cell with really high internal resistance.

These new cells need around 400 to 450 degrees to explode. And when the batteries were assembled they had no charge; who knows how long they sit before the factory gives them that initial 30 percent charge.
They can sit at 1 volt for a long time. I’ve found packs that must have been sitting at 0.5 or 1 volt for several months, maybe a couple of years from the looks of it, and the cells are fine; they actually seem to retain a good amount of capacity while at low voltage.

I just found a 30Q in a Tomo powerbank I lost around 6-7 months ago; last Friday it appeared. The powerbank was completely dead before it was lost (well, it has parasitic drain), and all the batteries were around 1.1 volts. It recovered to 2934mAh capacity and 49 mOhm resistance according to my Opus, and that’s before subtracting 30 like the manual states. I’ve had some EVVA 5200 26650s drained down to 0 volts twice, for a couple of months each time, in a light that shouldn’t have drained them, and I was able to recover the full capacity both times.

The major risk was with the old LiPo cells; everything from the last several years is a hybrid cell. There’s always a slight risk, but I use a lot of cells and I take my chances, since the failure rate is below 0.0001% for the big 4. Chinese cells, on the other hand, can be just as good, or horrible, and a lot of them don’t have a PTC and/or CID. Unprotected cells already have two means of protection built in: too much current, and the PTC kills the cell; too much heat and pressure, and the CID ruptures and the cell dies. They’ve kind of made it idiot-proof even with unprotected cells, as much as they can. Keep them away from fire, don’t carry them loose in a pocket, and you’ll probably never have an issue.

I’ve never had a lithium cell vent. I have had NiMH cells vent: good cells like Imedion, less than 30 cycles, on a moonlight mode, after about 5 seconds.
On a side note, this will depend on what current you’re discharging the cell at. If the discharge current is at the data-sheet spec, then 3.0V should be your cut-off. But if the cell is being discharged at 3 amps, it is not as depleted when it hits 3.0V as it would be at the spec-sheet current (580mA) down to 3.0V; 3 amps down to 2.7V is roughly comparable. The cell voltage will bounce back a lot after the load is removed.
Contrary to what some may believe, I've discerned that recommended cut-off voltage values aren't chosen to prevent cell damage “because you're too low and the goo is already reactimelting inside the cells LoL”. In fact, there is at least one cell with a recommended cut-off as low as 2V: the LG 18650 HE2.
Carefully observing the voltage curve profiles of discharge graphs at different discharge rates, I've come to realize that the discharge cut-off voltage is selected to be just low enough for the discharge curves to converge; this ensures maximum capacity delivery at all current rates. Look at the graph and you can see the reason for 3V to be the recommended cut-off for this cell: at 3V all curves deliver within ±2.5% (≈2.78Ah to ≈2.91Ah), and going any lower barely improves this.
And here you can see why this and other high-discharge cells need much lower cut-offs, the main reason being the huge output voltage drop at higher current rates, since ΔV = I × Rcell. Down to just 2.8V is already iffy for the 20A curve, since it only scores 2.24Ah, versus 2.45Ah max at the lowest rate.
If the discharge test had been done down to 2.4V, even the 30A curve would have hit above 2.3Ah, though of course the 30A curve is out of spec (overheating cells).
In this case, down to 2.9-2.8V looks fine at up to 15A, and 2.7V for 20A, but for that you already need to pulse discharge (or meltdown LoL).
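The ΔV = I × Rcell point above, and the “bounce back” mentioned earlier, can be put in numbers. The internal resistance here is an assumed round figure for a high-drain 18650, not taken from any datasheet; real values depend on the cell, its age, and its state of charge.

```python
# Voltage sag under load: delta_V = I * R_cell.
# R_CELL_OHM is an ASSUMED round figure for a high-drain 18650,
# not a datasheet value.

R_CELL_OHM = 0.020  # 20 mOhm, assumed

def loaded_voltage(open_circuit_v: float, current_a: float) -> float:
    """Terminal voltage while the cell delivers current_a amps."""
    return open_circuit_v - current_a * R_CELL_OHM

# The same ~3.0 V open-circuit level reads very differently under load:
for i in (0.2, 10, 20, 30):
    print(f"{i:4.1f} A load: terminal voltage = {loaded_voltage(3.0, i):.2f} V")
# At 20 A the terminals show 2.60 V while the open-circuit voltage is
# still 3.0 V. That is why high-drain discharge curves need lower
# cut-offs, and why the voltage bounces back once the load is removed.
```

With these assumed numbers, a 2.5V cut-off reached under a 20A load corresponds to roughly a 2.9V resting voltage, so the cell is nowhere near as deeply discharged as the loaded reading suggests.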
This is in fact what I mean… running down to the cut-off voltage is for me only a ‘shit happens once in a while’ case… so while I charge a battery 100 times, this might only happen at a ratio of, let’s say, 10:1…
So I was wondering: is running down to the 2.5V cut-off at this ratio as bad as charging up to 4.20V (or even storing at that level)… or are 2.5V and 4.20V the same torture for a battery?
Meaning: is the negative effect of 100x discharging to the 2.5V cut-off the same as 100x charging to (and even storing at) 4.20V? I fear the 2.5V case is even worse… but I don’t know for sure…