A different way to measure battery current in FET-driven hot rods

The idea for this experiment came from the need to find an alternative way to measure the current drawn by my quad-emitter Convoy M2. My very cheap multimeter reported 6A on direct drive, which seemed unlikely considering how quickly the light got hot and how fast it ate through batteries :person_facepalming: . Another hint that something was wrong was that low voltage protection triggered very easily when I tried to take longer readings, suggesting the cause of the wrong readings may be the much too high electrical resistance of the multimeter's leads.

I don't know if someone has come up with this idea already, so here I will explain the procedure I came up with as best I can, in case someone else wants to try it:
What is needed:

- a multimeter or some other way to measure battery resting voltage

- a stopwatch or some other way to measure time

- a glass of water, bigger is better

- accurate low-current discharge curves for the battery you are using; I used these: lygte-info.dk/review/batteries2012/Samsung%20INR18650-30Q%203000mAh%20(Pink)%20UK.html

- the flashlight and a charged battery, of course

Procedure:

- record resting battery voltage before putting it in the light (in my case it was 4.18V)

- put the light on turbo and leave it underwater inside the glass so it doesn't get too hot during the measurement; start the stopwatch at the same time

- after the amount of time you want to sample has passed (I did 3:00), shut off the light and pull out the battery

- don't record the voltage right away, because if the battery has been discharged at high current the voltage will climb back up over the following minutes. Take a reading every minute until the voltage stops climbing, then record it (3.98V for me).

- consult the lowest-current discharge curve for your battery (low current because, although it doesn't account for the additional energy wasted in the battery's internal resistance, you are using resting voltages, and that should be the curve most comparable to them) to get an idea of the Ah consumed:

this was the curve I looked at, and I read off about 0.63Ah of used capacity.

- then, to get the average current during the test, divide that by the duration; in my case the result was 0.63 Ah / 0.05 h = 12.6A
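The steps above boil down to one division; here is a minimal Python sketch using the numbers from my test (the capacity figure is whatever you read off the discharge curve by eye):

```python
# Average-current calculation from resting voltages + a discharge curve.
start_voltage = 4.18      # resting voltage before the test, V (for reference)
end_voltage = 3.98        # resting voltage after relaxation, V (for reference)
capacity_used_ah = 0.63   # Ah read off the low-current discharge curve
duration_h = 3.0 / 60.0   # 3-minute test expressed in hours

average_current = capacity_used_ah / duration_h
print(f"{average_current:.1f} A")  # → 12.6 A
```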

So this little beast is able to sustain quite a lot of power when kept cool enough, in my hands the battery tube becomes too hot in less than a minute though :laughing:

I'm glad it worked. It's not as quick as using a multimeter directly, but it's a different method with its own advantages and disadvantages. The ones I can think of are:

- this method gives you a measurement of the complete flashlight system; using a multimeter in place of the tailcap can introduce measurement errors because the multimeter has higher/lower resistance than the tailcap switch (as in my case).

- this method gives you a more useful / practical reading, as it is an average of the power over a certain period of the discharge rather than an instant reading (you can take average readings with a multimeter, but it's difficult with high-powered lights because they get very hot and you can't just dunk them in water).

- it's fun seeing how much your light can heat up the water

- because you are looking at a low-current discharge curve for a relatively new battery, the power lost to internal resistance is underestimated, even more so if you are using an old battery. I'd estimate the margin of error to be under 10% in my case, but it gets worse with higher discharge currents and weaker batteries

- it's more work

If you decide to try this method, let me know the results; please point out any error in my reasoning, and good luck to anybody trying this!

(if anyone wants to know, the light tested was an M2 host with kiriba's quad copper spacer, 4 Nichia 219C D320 on the MTN quad MCPCB, a Carclo 10622 optic behind an AR-coated glass from Kaidomain, an MTN-17DD with 20AWG positive wire and 18AWG negative, tail spring bypassed with 20AWG, stock 1288 switch)

It reads smart, but I consider myself a newb.
Praise for thinking out a new way and testing it.
Very interesting how the people smarter than me react.
For now, from me, for what it's worth: kudos!

Hey how are you doing man? Always nice to do some out of the box thinking using physics. :+1:
I have one remark though. You wrote:

"… as it is an average of the power during a certain period of the discharge as compared to an instant reading."

Yes. What you did was derive a specific average discharge rate from resting voltages. The curves in HKJ's tests are indeed constant current, but the actual discharge rate of a FET-driven light is not constant: the higher the voltage, the higher the current/power/output. This means a fully charged cell with the lowest voltage sag gives the highest output (the Sony VTC6, for example, is a good candidate), and the more depleted the cell, the lower the voltage, so the lower the current and output.
This means that in your case the initial current when you start the test is higher than 12.6A, and right before you ended the test at 3 minutes the current is lower than 12.6A.
But it's a start, and you can (most likely) fine-tune it to obtain better predictions.

Edit:

I'm going to derive a rough estimate of what the initial current at 4.2V was. This is just to give you an idea of the order of magnitude we're looking at. Post #962 of this thread: TK's Emisar D4 review shows the output of another single-18650, quad-emitter, FET-driven light, the Emisar D4. That light is water cooled as well.
At the start the light output is 100, and after 3 minutes (the same duration as your test) the output has dropped to, meh, about 90. Simplifying, assume a linear relation between output and current. Initial output was 100% and final output at t = 3 minutes is 90%, so the average value of 12.6A corresponds to an output of 95%. Hence at 100% (t = 0 s), the current is (100/95)*12.6 = 13.3A.
This is a pretty "vulgar" method, but it should nevertheless give you a rough idea of the deviation to expect from calculating the average current the way you did.
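The linear-scaling correction above can be sketched in a few lines of Python (the 100/90 output figures are eyeballed from TK's D4 graph, as described):

```python
# Rough initial-current estimate under the linear output-vs-current
# simplification: the measured average maps to the midpoint output.
avg_current = 12.6     # A, measured average over the 3-minute run
output_start = 100.0   # relative output at t = 0
output_end = 90.0      # relative output at t = 3 min (from the D4 graph)

output_avg = (output_start + output_end) / 2   # 95
initial_current = avg_current * output_start / output_avg
print(f"{initial_current:.1f} A")  # → 13.3 A
```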

I'm doing well, thanks
What you said is right, and as far as I can tell there is no way to get instant current readings with this method. On the other hand, I think the fact that a FET driver doesn't allow constant output makes average readings more useful, because they give you an idea of how the light will perform over longer periods. In this case it gives me a pretty good idea of the average power I can get on turbo when I EDC it, since in a day's use I don't usually get the batteries below 3.9V.

Your method does seem to give you an idea of the current draw, but there are more accurate alternative methods.

If you have an analyzing charger (one that tells you how much charge it put in after charging), you can use what I call the "gas tank method": fully charge the cell, run the light for 1 minute, then recharge the cell. From the charge put back in you can calculate the average current draw during that minute.
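A quick sketch of that arithmetic, with a made-up charger readout (the 210 mAh figure is a hypothetical example, not a measured value):

```python
# "Gas tank method": charge restored by the charger over run time.
charge_restored_mah = 210   # hypothetical charger readout after a 1-min run
run_time_min = 1.0

avg_current_a = (charge_restored_mah / 1000) / (run_time_min / 60)
print(f"{avg_current_a:.1f} A")  # → 12.6 A
```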

An easy way to measure the instantaneous current is with a shunt resistor. You are basically making your own low-resistance ammeter. Complete the circuit with a known resistance then measure the voltage across the resistance and calculate the current using V=IR. I use a length of wire that I measured to be 9.4 mOhms. This is about the resistance of a tail switch with bypassed spring. For this method to work you need to measure the resistance of the piece of wire you use. To do this you need at least 2 DMMs, one to measure the current through the wire and one to measure the voltage across it.
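The shunt calculation is just Ohm's law; here is a sketch using the 9.4 mOhm wire resistance from the post (the measured voltage drop is a hypothetical example value):

```python
# Shunt ammeter: I = V / R across a known low resistance in the circuit.
shunt_resistance_ohm = 0.0094   # the 9.4 mOhm length of wire from the post
measured_drop_v = 0.118         # hypothetical DMM reading across the shunt

current_a = measured_drop_v / shunt_resistance_ohm
print(f"{current_a:.1f} A")  # → 12.6 A
```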

I think the shunt resistor method is for sure one of the best if you want accurate instant results :+1:

Using an analyzing charger has similar sources of error to my method: the battery's internal resistance eats up different amounts of energy when charging compared to a quick discharge, plus some chargers aren't very accurate at measuring the amount of charge, so I'm not sure it would be more accurate in most situations.

It would actually be really interesting if somebody with all the necessary equipment tested these methods just to see how they compare in accuracy.

With the charger method you do have to worry about the accuracy of the charge counter. But you don't have to worry about the internal-resistance issue you are talking about: Li-ion cells have very high coulombic efficiency, which means the charge (mAh) put in during charging is nearly equal to the charge taken out during discharge (greater than 99%). Energy is dissipated in the battery due to the IR, but this doesn't affect the charge measurement.

With your method I think the main source of error is that you are assuming your cell has the exact same discharge curve as the one HKJ tested. It is probably very close especially for new cells, but I would not be very confident that they are the same in general.

So for EDC, how often do you typically need or use turbo, and what would be the duty cycle before the cell was depleted to your LV cutoff, i.e. ON for a minute, then cool down for xx minutes, then ON again, etc.?

I like your average-measurement method and think it is quite clever. Did you ever measure the temperature rise of a known volume of water? That might provide another way to back into an average current value if you are handy with calculatus eliminatus…

For EDC I mostly use a mode between 10 and 50 lumens, with occasional bursts of turbo that don't usually last more than 30 seconds.
I too thought of measuring the water temperature when I felt it had gotten perceptibly warmer, but I think it only rose a few degrees, so I would need either much less water or a more accurate thermometer to get decent accuracy. Or a calorimeter.
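For anyone curious, the calorimetry version would look something like this. All the numbers below are hypothetical examples, and it assumes all the electrical energy ends up as heat in the water (it ignores heat lost to the glass, the air, and light escaping), so it would underestimate a bit in practice:

```python
# Back-of-envelope calorimetry: energy = m * c * dT, then P = E/t, I = P/V.
water_mass_kg = 0.3      # ~300 ml of water (hypothetical)
specific_heat = 4186     # J/(kg*K) for water
temp_rise_c = 7.5        # hypothetical measured temperature rise
duration_s = 180         # 3-minute run
avg_cell_voltage = 4.0   # rough mid-run cell voltage, V

energy_j = water_mass_kg * specific_heat * temp_rise_c
avg_power_w = energy_j / duration_s
avg_current_a = avg_power_w / avg_cell_voltage
print(f"{avg_current_a:.1f} A")  # → 13.1 A
```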

I didn't worry much about it because my cells are rather new (< 10 full cycles), but I probably should have mentioned it in the OP. And it's true that this makes the method bad for old cells, since it's hard to find data for older cells and almost impossible to confirm whether your batteries have aged as much as those used in whatever data you find.

Nice and interesting test.
But IMHO a clamp meter is the best method, and budget friendly (around 30) :money_mouth_face: