Question for electrical engineers and people with more charger/battery knowledge than me: is it possible to constant-voltage charge NiMH, say to 1.5 or 1.55V? Will there be a problem with too much inrush current if the battery is fully discharged to 0.9V? Or problems when it reaches the 1.5V?
What I'm trying to do is make a device that will work on battery, on wall power, or while charging. My idea is to have the device (which has a large input voltage range) connected in parallel with the battery and power supply, so that it works in all cases. Ideally, with a power supply that can source more than the device's required current, it could also run while charging the battery.
I'm not an electrical engineer, but I know that negative delta V (NDV) is what terminates the charge with NiMH.
It might be best to hunt for an IC that does NiMH charging and go from there.
NDV is not an easy thing to detect.
Even my iCharger 208B will miss it sometimes if the charge current is set too low.
As long as I never reach that "fully charged" point, the current flowing into the battery should decrease as the battery voltage approaches the power supply voltage, no?
I don't care if the battery doesn't reach its maximum charge.
I just can't use a NiMH charger, because I need the "charger" to keep delivering power to run the device even if the battery is "full" (i.e. at the same voltage as the power supply).
I would think that might work as long as you tested the batteries beforehand to see where the voltage rests after a full charge. As long as you're below that by a few percent, I can't see why it would ever reach the fully charged voltage if the supply stays below it.
I would think it wouldn't be any different from a lead-acid battery in a car. Instead of a voltage regulator, you're regulating with the supply voltage.
My idea was, for example, to attach the battery to a 1.5V power supply; then the battery will never go above 1.5V, so it will always be near fully charged.
Since it will have a bit of self-discharge, the power supply will constantly feed the battery a few mA, but since trickle charging won't damage NiMH it should be OK.
A few mA wonāt heat up the battery at all.
The only thing I need to worry about is when the battery initially begins charging: if the battery is at 0.9V and the charger is sitting at 1.5V, there may be too much current at first, so I would need a current-limiting power supply.
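As a rough sketch of why a limiter is needed and how big a simple series resistor would have to be: with only the cell's internal resistance in the path, Ohm's law sets the initial current. The internal resistance (0.05 ohm) and the 1A target below are assumptions for illustration, not measured cell data.

```python
# Sizing a series resistor to cap the initial charge current.
# All values are assumptions for illustration, not measured cell data.

V_SUPPLY = 1.5   # constant-voltage supply (V)
V_EMPTY  = 0.9   # worst-case discharged cell voltage (V)
R_INT    = 0.05  # assumed cell internal resistance (ohm)
I_MAX    = 1.0   # chosen current limit (A)

# Without any external resistance, Ohm's law gives the inrush current:
inrush_bare = (V_SUPPLY - V_EMPTY) / R_INT
print(f"inrush with no resistor: {inrush_bare:.1f} A")

# Series resistor needed so the worst-case current equals I_MAX:
r_series = (V_SUPPLY - V_EMPTY) / I_MAX - R_INT
print(f"series resistor: {r_series:.2f} ohm")

# Worst-case dissipation in that resistor, for picking a power rating:
p_resistor = I_MAX**2 * r_series
print(f"resistor power at inrush: {p_resistor:.2f} W")
```

A dedicated current-limited supply does the same job without wasting the resistor's dissipation, but the resistor version shows the magnitudes involved.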
Thanks for your input! Let's see if anyone else has thoughts on whether my idea could work.
1.5V is a bit too high for constant voltage; keep it at 1.4V maximum and the cells will last a lot longer.
I have a Gigaset wireless phone, and the AAA NiMH cells in that phone don't last very long because the voltage is too high.
They get constant voltage, but it is a bit over 1.5V/cell.
OK, sounds good. I'll do a few tests between 1.4 and 1.5V to see what works best.
I'll be using Eneloops though, which are likely a lot higher quality than the crappy Chinese cells that phone handsets come with. They may handle higher voltages a bit better, but I'll have to test to be sure.
Yeah, that's what I was thinking. The cell has only 0.05 ohms of internal resistance, which would mean 12A of inrush current when charging a 0.9V cell from a 1.5V power supply: way too high.
A 0.5 ohm resistor would reduce that to about 1A, which is what many chargers use and is safe.
The resistor may slow down the charging process a bit, but since the voltage difference drops as the cell approaches full charge, the resistor dissipates less and less power, so the cell should still end up at the power supply voltage.
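The taper described above can be illustrated with a crude simulation. The cell is modeled as a large capacitor behind its internal resistance; real NiMH chemistry is nonlinear, so this only shows the general shape of the curve. The resistances match the numbers in the thread (0.05 ohm internal, 0.5 ohm series), while the 2000F "capacitance" is a made-up value chosen just to produce a visible taper.

```python
# Qualitative sketch of the current taper during constant-voltage charging.
# Cell modeled as capacitor + internal resistance; values are assumptions.

V_SUPPLY = 1.5           # constant-voltage supply (V)
R_TOTAL  = 0.05 + 0.5    # cell internal + series resistance (ohm)
C_CELL   = 2000.0        # hypothetical effective capacitance (F)

v = 0.9                  # start from a discharged cell (V)
dt = 1.0                 # simulation time step (s)
currents = []
for t in range(7200):    # two simulated hours, Euler integration
    i = (V_SUPPLY - v) / R_TOTAL   # Ohm's law across the resistances
    v += i * dt / C_CELL           # cell voltage creeps up as charge flows
    if t % 1800 == 0:
        currents.append(i)
        print(f"t={t:5d}s  V={v:.3f}V  I={i:.3f}A")
```

The current starts near the ~1.1A limit set by the resistor and falls off exponentially as the cell voltage closes in on the supply voltage, which is exactly the behavior being relied on here.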