Because of the temperature-dependent chemical reactions inside the cell. As a general rule, heat improves performance to some extent, chemistry dependent of course. While many modern cells don't show increased capacity at higher discharge rates, a reduction in internal resistance as the cell heats up is common.
That was my thought as well. I have seen cold li-ions under a 3 A discharge actually increase slightly in voltage as they were being discharged, because the internal heat from the discharge warmed the cell and lowered its internal resistance. Since the discharge current is constant, the warm cell doesn't have to work as hard, so the voltage rises slightly. In my experience a warm cell will give more capacity in a discharge test than a cold one.
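To make that mechanism concrete, here's a toy constant-current discharge model in Python. Every number in it is an illustrative assumption, not data from any real cell: terminal voltage is open-circuit voltage minus I·R, and R falls as self-heating warms the cell, so the voltage under load can rise early in the run even while the open-circuit voltage slowly falls.

```python
# Toy model: constant-current discharge of a cold li-ion cell.
# All parameters are illustrative assumptions, not real cell data.

I = 3.0        # discharge current, A (held constant by the analyzer)
ocv = 4.10     # open-circuit voltage, V (falls slowly as charge is drawn)
temp_c = 0.0   # cell temperature, deg C (starts cold)

def r_internal(temp_c):
    """Assumed internal resistance: higher when cold, lower when warm."""
    return 0.040 + 0.060 * max(0.0, 25.0 - temp_c) / 25.0  # ohms

for minute in range(16):
    r = r_internal(temp_c)
    v_load = ocv - I * r                 # terminal voltage under load
    print(f"t={minute:2d} min  T={temp_c:4.1f} C  "
          f"R={r*1000:5.1f} mohm  V={v_load:.3f} V")
    ocv -= 0.004                         # slow OCV decline from charge removed
    temp_c = min(30.0, temp_c + 3.0)     # I^2*R self-heating warms the cell
```

Running it, the terminal voltage climbs for the first several minutes (the resistance drop outpaces the OCV decline) and only then starts falling, which matches what you'd see on a cold cell under a 3 A load.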
No problem. It may just be the nature of this charger/analyzer (Lii-500); it consistently reads capacity about 5-6% low on every cell I test with it.
The chart they posted (see below) shows capacity measured at various current levels, from as low as 0.5 A up to 8 A. At 0.5 A, their tested capacity is 812 mAh. The main difference is the cutoff voltage: theirs is 2.5 V, mine is 2.8 V, but there can't be much capacity left between those two points.
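For anyone curious how an analyzer arrives at a figure like that: capacity is just discharge current integrated over time until the cell hits the cutoff voltage. A minimal coulomb-counting sketch (Python; the voltage curve is a made-up placeholder, not measured data) shows why a 2.5 V cutoff squeezes out a little more than 2.8 V, but only a little, because the tail of the curve is so steep:

```python
# Minimal coulomb counter: integrate current over time until cutoff.
# The voltage-vs-charge curve below is a made-up placeholder, not real data.

def v_under_load(mah_drawn):
    """Assumed terminal voltage as charge is drawn (fake but plausible shape)."""
    if mah_drawn < 750:
        return 4.0 - 1.0 * (mah_drawn / 750)   # gentle slope down to 3.0 V
    return 3.0 - 0.005 * (mah_drawn - 750)     # steep knee at the end

def test_capacity(current_a, cutoff_v, dt_s=1.0):
    mah = 0.0
    while v_under_load(mah) > cutoff_v:
        mah += current_a * 1000 * dt_s / 3600  # A over dt_s seconds -> mAh
    return mah

for cutoff in (2.8, 2.5):
    print(f"cutoff {cutoff} V -> {test_capacity(0.5, cutoff):.0f} mAh")
```

With this fake curve the 2.5 V cutoff only adds a few dozen mAh over 2.8 V, which is the same intuition as "there can't be much capacity left at that point."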
Notice that the "begin volt" is 4.198 V. That is almost impossible to achieve even with cells fresh off the charger; I assume they set the charge termination beyond 4.2 V, probably around 4.25-4.3 V, in order to start the test at such a high voltage.
With the MC3000 you get a "d reduce" option, which is the equivalent of a reversed CV phase for discharging: the cell is held at the cutoff voltage while the current tapers off. With a termination current low enough, combined with the high starting voltage, the 800 mAh+ capacity might be achievable.
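In case "reversed CV" is unclear, here's a rough sketch of that discharge logic in Python. This is my reading of the idea, not the MC3000's actual firmware; the fake cell model and every threshold are illustrative assumptions:

```python
# Sketch of a CC-CV discharge ("reversed CV"): constant current down to the
# voltage floor, then hold the floor and taper current until termination.
# The cell model and all thresholds below are illustrative assumptions.

CC_CURRENT = 1.0   # A, constant-current phase
V_FLOOR = 2.5      # V, discharge voltage floor
I_TERM = 0.05      # A, termination current for the taper phase

ocv, r_int = 3.2, 0.08  # fake cell: open-circuit voltage (V), resistance (ohm)
mah = 0.0

while True:
    # Largest current that keeps the terminal voltage at or above the floor.
    i_max = (ocv - V_FLOOR) / r_int
    i = min(CC_CURRENT, i_max)     # CC phase while possible, then taper
    if i < I_TERM:
        break                      # taper current fell below termination
    mah += i * 1000 / 3600         # 1-second step, accumulated in mAh
    ocv -= 0.0002 * i              # crude OCV sag as charge is removed

print(f"accumulated capacity: {mah:.0f} mAh")
```

The taper phase is where the extra capacity comes from: once the terminal voltage pins at the floor, the current decays toward the termination threshold while charge keeps trickling out.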
When people say "down to 2.5/2.8 V", do they mean under load or not? No one seems to make the distinction, but I'm pretty sure it makes a non-negligible difference depending on the rate.
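It does, and the gap scales directly with current: V_rest - V_load = I * R_int. A quick sketch with an assumed (not measured) 50 mohm internal resistance shows how much the distinction matters at different rates:

```python
# Gap between resting and under-load voltage: V_rest - V_load = I * R_int.
# R_INT is an assumed illustrative value, not a measurement of any cell.

R_INT = 0.050  # ohms (assumed)

for current_a in (0.5, 3.0, 8.0):
    sag = current_a * R_INT
    print(f"{current_a:3.1f} A load: cutoff 2.80 V under load "
          f"~= {2.80 + sag:.2f} V at rest (sag {sag*1000:.0f} mV)")
```

So at 0.5 A the two readings are only about 25 mV apart, while at 8 A an under-load 2.8 V cutoff corresponds to roughly 3.2 V resting, which is why the under-load vs. resting distinction matters most in high-rate tests.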