No! Not too complex, don't complicate it — the simplest design will be the most useful.
Yes to measuring impedance (internal resistance).
Yes to measuring capacity (during discharge).
Yes to a storage charge (3.7 V).
Selectable charge termination voltages: 3.6 V (LiFePO4), 4.20/4.35 V (Li-ion), 1.48 V (Ni-MH/Ni-Cd), and 2.0 V (Ni-Zn).
Manual/auto charge selection.
Long bays (slots), if you can still change the casing design — future 26800 cells?
Just wondering if there should be a timer cut-off (e.g., if a battery has been charging for over xx hours, stop charging, since taking that long to charge indicates a bad battery)… Although selecting a very low charge current for a very high-capacity battery will also result in charging for very long hours… so I'm not sure how this would be done.
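One way around the low-current/high-capacity problem would be to scale the timeout to the selected current instead of using a fixed number of hours. A rough sketch of the idea — the function names and the 1.5× safety margin are my own assumptions, not anything from an actual charger firmware:

```python
# Hypothetical timer cut-off that scales with the selected charge current,
# so a low current into a high-capacity cell doesn't trip a fixed timer.
# The 1.5x margin is an assumed safety factor.

def charge_timeout_hours(capacity_mah: float, charge_current_ma: float,
                         margin: float = 1.5) -> float:
    """Expected full-charge time (capacity / current) times a safety margin."""
    return margin * capacity_mah / charge_current_ma

def should_cut_off(elapsed_hours: float, capacity_mah: float,
                   charge_current_ma: float) -> bool:
    """True if charging has run well past the expected time -> likely bad cell."""
    return elapsed_hours > charge_timeout_hours(capacity_mah, charge_current_ma)
```

For example, a 3000 mAh cell at 1000 mA would be expected to finish in about 3 hours, so the cut-off would trip after 4.5 hours — while the same cell at 250 mA would be allowed 18 hours.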
This would be a new “mode”. It wouldn’t complicate existing modes.
When this mode gets used, it would top off the battery, discharge it, recharge it, then display efficiency.
I don’t think I’ve seen mention of this type of metric before.
I don’t think it would have much impact on wall-outlet-based charging, as the only difference would be a little time. When charging from a limited supply like a solar cell, though, I can see where you might want to know which cells will use the least source current to achieve a full charge.
I also don’t think this is an actual, distinct mode. What you’re asking for is just an additional calculation based on a standard Full Test. I see the biggest challenge being finding space on the display to show it.
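The extra calculation described above could be as simple as dividing the energy returned during the discharge phase by the energy consumed during the recharge. A minimal sketch, assuming the charger logs voltage/current samples during a Full Test (all names and data here are illustrative, not actual firmware):

```python
# Illustrative sketch of the proposed efficiency metric: energy out on
# discharge divided by energy in on recharge. Sample format is an assumption:
# a list of (seconds, volts, amps) tuples logged during each phase.

def energy_wh(samples):
    """Integrate volts * amps over time (trapezoidal rule), in watt-hours."""
    total = 0.0
    for (t0, v0, a0), (t1, v1, a1) in zip(samples, samples[1:]):
        dt_h = (t1 - t0) / 3600.0                   # seconds -> hours
        total += 0.5 * (v0 * a0 + v1 * a1) * dt_h   # average power * time
    return total

def efficiency_pct(discharge_wh: float, recharge_wh: float) -> float:
    """Round-trip energy efficiency as a percentage."""
    return 100.0 * discharge_wh / recharge_wh
```

So a cell that delivered 10.0 Wh on discharge but needed 12.5 Wh to recharge would show 80% efficiency — a single number that fits the "one extra value on the display" framing.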
Yeah, I thought it up.
For the flashlight community, you are right, efficiency is just a matter of a little more time on the charger. But for the DIY Powerwall community, battery efficiency would be very interesting and could identify poor performers.