I was curious if anyone has compared the charging circuit of a rechargeable flashlight and a real battery charger?
It seems the best way for a novice (myself) to check would be to:
- Charge the battery in the flashlight, using its onboard charging circuit.
- Run the flashlight in a “medium” draw mode and record the run time.
- Charge the battery in a dedicated charger.
- Repeat step 2.
Is this scenario reasonable?
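To make the comparison concrete, here is a minimal sketch of the arithmetic for comparing the two run times; the numbers and variable names are placeholders, not real measurements:

```python
# Hypothetical run times for the same flashlight on "medium", after each charging method.
runtime_after_onboard_charge_min = 190.0   # step 2: cell charged via the flashlight's own circuit
runtime_after_external_charge_min = 195.0  # step 4: cell charged in a dedicated charger

# Percentage difference, relative to the external-charger result.
diff_pct = 100.0 * (runtime_after_external_charge_min - runtime_after_onboard_charge_min) \
    / runtime_after_external_charge_min

print(f"Run-time difference: {diff_pct:.1f}%")
# A difference of only a few percent is within normal cell-to-cell and timing variation,
# so it would not by itself prove one charging circuit is better than the other.
```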
That test won’t show any difference, since a battery charged to the same voltage will have the same capacity; it doesn’t matter where it was charged.
The quality of the charging circuit in a light affects charging speed, efficiency, and heat generated. The end result is a fully charged battery at 100% capacity unless the charger is faulty.
So, what I should do is:
- Drain the battery to about 50%.
- Charge it in the flashlight and check the voltage.
- Drain the battery to about 50% again.
- Charge it in a dedicated charger and check the voltage.
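A rough sketch of how the two resting-voltage readings could be compared; the voltages and the 0.02 V tolerance are made-up example values for a Li-Ion cell, not measurements:

```python
# Resting voltages measured some time after each charge ended (hypothetical values).
v_after_flashlight_charge = 4.19  # volts
v_after_external_charge = 4.20    # volts

tolerance_v = 0.02  # assumed allowance for meter error and normal termination spread

if abs(v_after_flashlight_charge - v_after_external_charge) <= tolerance_v:
    print("Both charge methods terminated at essentially the same voltage.")
else:
    print("The termination voltages differ noticeably; one circuit may stop early or overcharge.")
```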
I had thought there were two factors involved in charging a battery:
- The cell’s current voltage level.
- The amount of charging current.
My thought (as a novice) was that when the charging current drops to a preset low value at a given voltage, the battery is deemed fully charged.
That appears to be the case with my Opus C3100 charger: I see a charge rate of 400 mA or 500 mA that gradually drops to about 20 mA before the battery is shown as fully charged. I also see a voltage of 1.47 VDC on AAA cells and the like, even though the rechargeable batteries are rated at 1.2 VDC.
I did not expect a “good” charging circuit to stop charging based on voltage alone.
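A minimal sketch of that taper-based “deemed full” rule as I understand it; the default numbers just echo the readings from my Opus and are illustrative, not its actual programmed thresholds:

```python
def deemed_fully_charged(charge_current_ma, cell_voltage_v,
                         hold_voltage_v=1.47, cutoff_current_ma=20.0):
    """Sketch of the rule described above: once the charger is holding the cell
    at its target voltage, the cell is deemed full when the charge current has
    tapered below a preset cutoff. Default values are illustrative only."""
    return cell_voltage_v >= hold_voltage_v and charge_current_ma <= cutoff_current_ma

print(deemed_fully_charged(400.0, 1.45))  # False: still drawing significant current
print(deemed_fully_charged(18.0, 1.47))   # True: current has tapered off at the hold voltage
```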
Li-Ion charging is completely different from Ni-MH charging. Yes, it uses CC-CV charging and there are different phases of charging, but they should all stop at 4.2 V ±1%. Even if there is a slight difference in the stopping condition, as long as it doesn’t overcharge and damage the cell, it should give you more or less the same capacity.
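For what it’s worth, here is a simplified sketch of that CC-CV decision logic for a single Li-Ion cell; the 500 mA and 50 mA figures are just example values, only the 4.2 V limit comes from the cell chemistry:

```python
def charger_action(cell_voltage_v, charge_current_ma,
                   cv_limit_v=4.20, cc_current_ma=500.0, term_current_ma=50.0):
    """Simplified CC-CV decision step for one Li-Ion cell.
    cc_current_ma and term_current_ma are illustrative, not from any datasheet."""
    if cell_voltage_v < cv_limit_v:
        # Constant-current phase: push a fixed current until the cell reaches 4.2 V.
        return ("CC", cc_current_ma)
    if charge_current_ma > term_current_ma:
        # Constant-voltage phase: hold 4.2 V and let the current taper naturally.
        return ("CV", charge_current_ma)
    # Termination: the current has tapered below the cutoff while at 4.2 V.
    return ("DONE", 0.0)

print(charger_action(3.70, 500.0))  # ('CC', 500.0)
print(charger_action(4.20, 200.0))  # ('CV', 200.0)
print(charger_action(4.20, 30.0))   # ('DONE', 0.0)
```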
> Li-Ion charging is completely different from Ni-MH charging.
THAT was the factor I was not considering… Thank you, kind sir.