Yeah. Unless you want to reduce cycle life drastically, a USB-C PD 3.0 port will never deliver a continuous 100W right to the end of the charge. The exception would be an aggressive charging profile that holds 100W all the way up to the CV phase, which is not too healthy for the cells inside, even if they are LG MJ1s.
We know the CV phase usually adds a top-off delay that depends largely on the resistance of the path from the charging circuit output to the battery, plus the battery resistance itself. Since the cells are spot welded, that resistance is usually quite low in a good powerbank, which shortens the CV phase a lot.
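To put rough numbers on it (all values assumed for illustration): with 5A of charge current through, say, 50mΩ of converter output path, wiring and welds plus cell resistance, the drop is 5A × 50mΩ = 0.25V. The converter output then reads 4.2V while the cell itself sits near 3.95V, and the CV phase spends its time working off exactly that 0.25V as the current tapers.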
By the way, the CV charging phase could be eliminated by monitoring the cell voltage right at the battery terminals and cutting off charging the moment the terminal voltage reaches its maximum (usually 4.2V) and the current tapers down to a preset value. That means monitoring two conditions instead of one.
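A minimal sketch of that two-condition cut-off, assuming hypothetical read_cell_voltage_mV() / read_charge_current_mA() helpers that Kelvin-sense right at the cell terminals (none of these names come from a real charger IC):

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical HAL -- assumed for illustration, not a real API.
 * Voltage must be Kelvin-sensed at the cell terminals so the
 * charger-to-cell I*R drop never enters the measurement. */
uint16_t read_cell_voltage_mV(void);
uint16_t read_charge_current_mA(void);
void     charger_disable(void);

#define V_MAX_MV   4200u  /* maximum permissible cell voltage */
#define I_TERM_MA   100u  /* preset taper current for termination */

/* Returns true once charging has been terminated. */
bool check_termination(void)
{
    /* Both conditions must hold: terminal voltage at the limit
     * AND current already tapered down to the preset value. */
    if (read_cell_voltage_mV() >= V_MAX_MV &&
        read_charge_current_mA() <= I_TERM_MA) {
        charger_disable();
        return true;
    }
    return false;
}
```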
Revisiting this a bit, maybe I muddled the explanation above. What I was trying to say is that if we monitor the cell voltage right at its terminals and use that value for cut-off, the whole I × R drop from the charging converter's output to the battery/cell terminals is eliminated. You say it does not eliminate the CV phase, but at least for low-internal-resistance cells its duration should shrink greatly, shouldn't it?
By adding a way for our charger to accurately measure the current-path resistance plus the cell's DC internal resistance, we could fill the cell or battery in a single constant-current go by adding dV = Icharge × Rpathplusbattery to the desired, cut-off-monitored charging voltage, wherever it is measured, with an absolute cut-off (no tapering whatsoever). So, for the cut-off voltage: Vcutoff = Vmax + Icharge × Rpathplusbattery.
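As a rough sketch of that single-CC scheme (hypothetical HAL calls again), one way to get Rpathplusbattery is a current step, taking ΔV/ΔI at the converter output, then charging at full current until the compensated hard cut-off:

```c
#include <stdint.h>

/* Hypothetical HAL -- assumed for illustration only. */
void     set_charge_current_mA(uint16_t ma);
uint16_t read_output_voltage_mV(void);   /* at the converter output */
void     delay_ms(uint32_t ms);
void     charger_disable(void);

#define V_MAX_MV     4200u   /* desired cell voltage at cut-off */
#define I_CHARGE_MA  3000u
#define I_STEP_MA    1000u

/* Estimate Rpathplusbattery (in milliohm) from a current step:
 * R = dV / dI, both measured at the converter output. */
static uint32_t measure_r_mohm(void)
{
    set_charge_current_mA(I_STEP_MA);
    delay_ms(100);
    uint16_t v_lo = read_output_voltage_mV();

    set_charge_current_mA(I_CHARGE_MA);
    delay_ms(100);
    uint16_t v_hi = read_output_voltage_mV();

    /* mV * 1000 / mA = milliohm */
    return (uint32_t)(v_hi - v_lo) * 1000u / (I_CHARGE_MA - I_STEP_MA);
}

void charge_in_one_cc_go(void)
{
    uint32_t r_mohm = measure_r_mohm();
    /* Vcutoff = Vmax + Icharge * Rpathplusbattery */
    uint32_t v_cutoff_mV = V_MAX_MV + I_CHARGE_MA * r_mohm / 1000u;

    set_charge_current_mA(I_CHARGE_MA);
    while (read_output_voltage_mV() < v_cutoff_mV)
        ;                       /* constant current, no taper */
    charger_disable();          /* absolute cut-off */
}
```

Note the step measurement lumps the path resistance and the cell DCIR together, which is exactly what the dV term above needs; the scheme only lands on the right final voltage if that estimate stays accurate over the whole charge.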
Well, if you mean that the charger's maximum output voltage must be set higher than the maximum permissible charging voltage (4.2V?), that is true. Still, the final battery voltage after cut-off should land right on the spot.
Yeah, but I guess the charge specifications mean that charging should stop when the cell voltage reaches the maximum allowed charge voltage. Since my method infers the absolute voltage drop caused by the charger-output-to-cell pathway plus the cell/battery's own internal resistance… I get it. The cell/battery would see an Icharge × Rbattery overvoltage at its terminals, but only for the time it takes to reach the charger's maximum voltage (with a sudden cut-off).
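For scale (DCIR assumed, not measured): an MJ1-class cell somewhere around 35mΩ at 3A gives Icharge × Rbattery ≈ 0.1V, so the terminals would momentarily sit near 4.3V just before the cut-off.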
I wonder if this would actually harm batteries, given the brief exposure time. It could also be paired with a slightly reduced final charge voltage, both to diminish this overvoltage and for the cycle-life benefit.
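One way to formalize that reduction (my reading, not from any spec): compensate the cut-off only for the external path, Vcutoff = Vmax + Icharge × Rpath, which is the same as lowering the fully compensated target by Icharge × Rbattery. The terminals then never exceed Vmax, at the price of stopping slightly short of a true full charge.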