Hey how are you doing man? Always nice to do some out of the box thinking using physics. :+1:
I have one remark though. You wrote:
“… as it is an average of the power during a certain period of the discharge as compared to an instant reading.”
Yes. What you did was derive a specific average discharge rate from the resting voltage. The curves in HKJ’s tests are indeed constant current. The actual discharge rate of a FET-driven light is not constant: the higher the voltage, the higher the current/power/output. This means you get the highest output with a fully charged cell that has the lowest voltage sag (the Sony VTC6 is a good candidate, for example). And the more depleted the cell, the lower the voltage, so the lower the current and output.
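To make that voltage dependence concrete, here is a minimal sketch of a direct-drive (FET) current model. All the numbers are my own assumptions for illustration (a fixed combined LED forward voltage and a guessed total series resistance), not measurements of your light; in reality Vf also rises with current, so this overstates the swing:

```python
def fet_drive_current(v_cell, v_led=3.1, r_series=0.05):
    """Rough direct-drive current estimate (A).

    v_cell   : cell voltage under load (V)
    v_led    : assumed combined LED forward voltage (V) -- simplified as constant
    r_series : assumed total series resistance of cell + FET + springs + wires (ohm)
    """
    # With no regulation, current is just the leftover voltage over the loop resistance.
    return max(v_cell - v_led, 0.0) / r_series

# Full cell vs. partly depleted cell: the current drops as the voltage does.
print(fet_drive_current(4.2))   # full charge -> highest current
print(fet_drive_current(3.7))   # depleted    -> noticeably lower current
```

The exact amps depend entirely on the assumed resistance and Vf, but the trend is the point: output tracks cell voltage in a FET light.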
This means that in your case the initial current when you start the test is higher than 12.6A, and right before you ended the test at 3 minutes the current is lower than 12.6A.
But it’s a start, and you can (most likely) fine-tune it to obtain better predictions.
Edit:
I’m going to derive a rough estimate of what the initial current at 4.2V was. This is just to give you an idea of the order of magnitude we’re looking at. Post #962 of this thread: TK's Emisar D4 review shows the output of another single-18650, quad-emitter, FET-driven light, the Emisar D4. That light is water cooled as well.
At the start the light output is 100%, and after 3 minutes (the same duration as your test), the output has dropped to, meh, about 90%. As a simplification, assume a linear relation between output and current. Initial output is 100% and final output at t=3 minutes is 90%, so the average value of 12.6A is assigned to an output of 95%. Hence at 100% output (t=0s), the current is (100/95)*12.6 = 13.3A.
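The arithmetic above can be written out in a few lines; the 100%/90% endpoints are read off the D4 graph as described, and the proportionality between output and current is the stated simplification:

```python
# Measured average current over the 3-minute run (A), from the original test.
avg_current = 12.6

# Output read from the D4 runtime graph: 100% at start, ~90% at 3 minutes.
out_start, out_end = 100.0, 90.0

# Linear simplification: the average current corresponds to the midpoint output.
out_avg = (out_start + out_end) / 2        # 95.0

# Scale the average current to the start and end of the run.
i_start = avg_current * out_start / out_avg
i_end   = avg_current * out_end / out_avg

print(round(i_start, 1), round(i_end, 1))  # 13.3 11.9
```

So by this estimate the test started around 13.3A and ended around 11.9A, bracketing the 12.6A average.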
This is a pretty “vulgar” method, but should nevertheless give you a rough idea what kind of deviation you can expect from calculating the average current the way you did.