How many amps can a flashlight pull?

Most of my P60-style 18650 flashlights pull about an amp, maybe 1.10 on average, up to 1.5. To me this seems normal.

And my cheaper lights do pretty poorly as well, somewhere between 1/3 and 3/4 of an amp... again, to be expected.

So why do my AA Tank 566 and 568 pull about 2.15 A, my Small Sun AA ZY-C55 about 2.25 A, and the AA Eastward YJ J09 a whopping 2.75 A?

(No wonder I like my Eastward.)

Is this common for AA lights? Is it because of the driver? Are they overdriven?

None of my lights seem to be chewing up the batteries. It seems to me that if something were pulling 3 amps it would kill a AA in 3 minutes. Am I wrong? Obviously they are not that power hungry.

I know this isn't a simple question. BUT... does anyone have a simple answer?

Are the numbers different at the tail vs. the emitter? Are they higher or lower at the tail?

The meter is right. It's a couple-hundred-dollar Fluke.

I blame myself...

It also depends on the voltage. If you use NiMH, the driver pulls a lot of amps to boost the battery's 1.2 V up to 3.7 V at a reasonable current.


So 2.2 A from a NiMH is great: 2.2 A × 1.2 V = 2.64 W. To get the same brightness from a 14500, it would only need to draw X = 2.64 W / 4 V = 0.66 A.

3 A will kill an alkaline in minutes, but not a NiMH, lithium primary, or Li-ion (which can all be used at higher currents).
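Here's the same arithmetic as a quick Python sketch; the cell capacities are just typical figures I'm assuming, not measurements:

```python
# Idealized boost driver: it holds LED power constant, so the battery
# current is LED power divided by battery voltage.
led_power_w = 2.64  # from the 2.2 A x 1.2 V example above

for name, batt_v, capacity_mah in [
    ("NiMH AA", 1.2, 2000),       # assumed typical capacity
    ("14500 Li-ion", 4.0, 800),   # assumed typical capacity
]:
    input_a = led_power_w / batt_v
    # Crude runtime = capacity / current. Real alkalines lose much of
    # their capacity at high drain, which is why 3 A kills them so fast.
    runtime_min = capacity_mah / 1000 / input_a * 60
    print(f"{name}: {input_a:.2f} A draw, roughly {runtime_min:.0f} min")
```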

I should have explained that the user's ability to process math or anything remotely complicated is severely impaired...

(Please speak VERY slowly... idiot alert!!!)

OK... the watt is the unit of power. The ampere is the unit of current. The volt is the unit of electromotive force (or voltage).

W = A x V

LEDs commonly run at about 3.7 V, so if you use a 1.2 V NiMH, the driver will draw more amps to boost the output voltage. Assume the driver's input and output power are the same, let's say 3 W (for an XRE-R2):

3 W = 1.2 V × 2.5 A (input from the battery)

3 W = 3.7 V × 0.81 A (output to the LED)
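As a minimal sketch of that lossless-driver math (100% efficiency is an idealization, as later posts point out):

```python
# Idealized driver: input power == output power (a real driver loses some).
led_v, led_a = 3.7, 0.81       # the XRE-R2 operating point from above
led_power_w = led_v * led_a    # about 3 W

battery_v = 1.2                # one NiMH cell
input_a = led_power_w / battery_v
print(f"The battery must supply {input_a:.2f} A at {battery_v} V")  # ~2.5 A
```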

That's not right...

I need explanations like my dog gets... "BAD dog"... "Go lay down"... Bad Romisen.

So let me get this straight... I'm sure if we talk about this for about a month solid, I'll get it...

The amperage draw I'm seeing depends on the voltage of the battery?? (Pesky math.) So this explains the differing numbers I was seeing on bigger lights that had 18650 batteries.

So really we have no idea what the true values are without a regulated power supply. I'm assuming the voltage sags or changes once a load is applied, or am I wrong? I was using AA alkalines and they read about 1.61 V. Does amperage draw say anything about the amount of actual light produced? Hey Brted, I need a 6th-grade science lesson.

Does a NiMH battery have some advantage over an alkaline, besides being rechargeable? I always perceived its lower voltage as a deficit.

Speak freely to the caveman

The amount of light produced is determined by the watts the LED consumes and by its efficiency (and some other factors).

If you connect an LED without a driver, it will only light up if the battery provides around 3.7 V. That's a Li-ion, or three alkalines (4.5 V) or three NiMH (3.6 V) in series. The current draw here depends on many factors, like the internal resistance of the battery, the kind of LED, the kind of wiring, etc.

NiMH cells have very low internal resistance, so they can deliver a lot of current without being damaged. If you try to draw a lot of current from an alkaline, it will be damaged internally and will not give you all of its capacity. That's why NiMH cells in series are used for high-power LEDs.
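To picture why internal resistance matters (and why voltage sags under load), here's a toy sketch; the resistance and voltage figures are rough guesses of mine, purely for illustration:

```python
# Toy model: voltage under load = open-circuit voltage - current * R_internal.
cells = [
    ("Alkaline AA", 1.6, 0.30),  # high internal resistance (assumed)
    ("NiMH AA",     1.3, 0.03),  # much lower internal resistance (assumed)
]
load_a = 2.0  # about what these AA lights are pulling

for name, open_circuit_v, r_internal_ohm in cells:
    loaded_v = open_circuit_v - load_a * r_internal_ohm
    print(f"{name}: sags to {loaded_v:.2f} V under a {load_a} A load")
```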

If you are powering a flashlight with an XRE R2 LED from 1x18650, it may draw something like 1 A on high (3.7 W of power). If you use two 16340s in series, it may draw 0.5 A, because they are 7.4 V in series, and if the driver is good it will deliver the same power: 3.7 W = 7.4 V × 0.5 A.

Be careful, because every driver is different; some of them will burn up if you feed them more than 4.5 V.

When people say that a flashlight draws X amperes, most of the time they assume one Li-ion cell (3.7 V); in any other case, they specify which configuration they are using.

If you use one AA (and the driver is good), it will draw a lot of amperes to compensate for the cell's low voltage. NiMH cells can sustain high current draws much longer than alkalines.

Thank you. I didn't know that about NiMH batteries, and I've used them for years. In fact, I have some really nicely built AA NiMH batteries, old dark green 1100 mAh cells (Sanyo, I think). They still work great; I just charged them up and ran them in a few lights for fun. Hard to get terribly excited over 1100 mAh, but they were high end in their day. I love made-in-Japan.

I know I’m jumping into this conversation SUPER late, but I was looking up info, trying to understand watts, amps, and volts myself. So based on what I’m reading in the posts above, if I had:

Flashlight A - 2.8 amp driver with 1x18650 would produce 10.36 watts (2.8 x 3.7 = 10.36)
Flashlight B - 2.0 amp driver with 2x18650 would produce 14.80 watts (2.0 x 7.4 = 14.80)

If I’m understanding this right, flashlight B would be brighter while pulling fewer amps, just because it’s using a higher voltage??? Is that right?

Boaz as a noob... how embarrassing.

This was almost 4 years ago. I’m sure whatever noob hat you wore then has been removed a long time ago. :slight_smile: I’m just trying to catch up to most of you guys. :~

What you are not considering is that the input and output of the driver are not the same thing.

Input: Volts x Amps = Watts
Output: Volts x Amps = Watts [where Watts is LOWER than the input wattage]

This is true in both cases and your example doesn’t seem to account for input and output not being the same thing.
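A quick sketch of how that changes the earlier A-vs-B comparison, treating the quoted amps as driver input current and assuming an 85% efficiency (my number, purely illustrative):

```python
# Input watts = battery volts x input amps; output watts are always lower.
efficiency = 0.85  # assumed; real figures vary by driver and load

for name, battery_v, input_a in [("Flashlight A", 3.7, 2.8),
                                 ("Flashlight B", 7.4, 2.0)]:
    input_w = battery_v * input_a
    output_w = input_w * efficiency
    print(f"{name}: {input_w:.2f} W in -> {output_w:.2f} W to the LED")
```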

So does it depend on the efficiency of the driver? Or just what the driver is designed to output? So lumen output depends on what the driver outputs plus the lumens/watt that the LED can produce?

It’s probably safest to try and narrow the number of variables you work with at once as you learn.

LED brightness can be influenced by many things, but if you are trying to wrap your head around drivers then the only ones you should be concerned with are amps and volts. As it turns out that’s a tricky relationship to explain to those who are not familiar with it: the relationship of voltage to amp draw is non-linear for LEDs. Take a look at these two example graphs in djozz’s post #51:

As you can see, increasing voltage a little bit will increase current a lot. Meaning that LEDs are very sensitive to voltage. We typically use drivers which allow us to establish a set current and then let the driver handle lowering the voltage to the appropriate level.

In your example earlier you had 7.4 V input into a driver; as you can see, this is not an acceptable output voltage for an XM-L2 or XP-G2. Typically a buck driver would step that down at something like 80% efficiency, so 10 W input would net 8 W output. As long as there is enough energy available, the driver will typically be monitoring and maintaining the output current. Every other number will depend on that output current in some way.
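For instance, here's that step-down arithmetic as a sketch; the 3.3 V forward voltage is my assumption for an XM-L2 at a couple of amps, read off a graph like djozz's:

```python
# Buck driver: output watts = input watts x efficiency.
input_v, input_a = 7.4, 1.35  # example input side, about 10 W
efficiency = 0.80             # typical-ish buck efficiency (assumed)

input_w = input_v * input_a          # ~10 W
output_w = input_w * efficiency      # ~8 W
led_vf = 3.3                         # assumed XM-L2 forward voltage
output_a = output_w / led_vf
print(f"{input_w:.1f} W in -> {output_w:.1f} W out -> ~{output_a:.1f} A to the LED")
```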

Thanks for the explanation! That helps a lot. The graph is even more helpful in understanding the non-linear relationship. So if you can help me understand this, it would be great…

Say I have an XM-L2 and it’s driven at 2.0 amps using 2x18650s. If the driver is a buck and lowers it at your example’s 80%, it would be 5.9 volts. How would you figure out estimated lumens on the graph that you linked? I guess what I don’t understand is how to read the graph if the amps stay constant but the voltage changes. Or, at a constant 2.0 amps, would the LED only work at one particular voltage, and an increase in voltage wouldn’t do anything?

  • 80% is an efficiency figure. You apply that to wattage only here.
  • Remember, everything revolves around the specified output current in this scenario. Don’t worry about how it’s specified, just know that we can arbitrarily dictate to the driver [when it is built] what the current will be. Everything else will follow.
  • Work your way backwards from an arbitrary output current. If you know the output current then you know the voltage, since they are related and you have a graph. If you know those things then you know the output wattage. If you know the efficiency then you know the input wattage... if you know the input voltage (actually you probably don’t, but let’s save that one!) and the input wattage then you must be able to figure out input current. At this point you’ve got all the relevant numbers, right? (There’s a worked sketch of this chain below.)

(Not all of this applies in every scenario, I’m specifically discussing “switching” drivers here such as buck, boost, sepic, etc. Linear drivers will behave differently, as will DD drivers.)
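Here's that backwards chain as a small sketch. The 3.0 A output current is arbitrary and the 3.4 V forward voltage is something I'm assuming you'd read off an I-V graph; both are illustrative, not measured:

```python
# Work backwards from the driver's specified output current.
output_a = 3.0   # the current the driver was built to regulate (arbitrary)
led_vf = 3.4     # assumed forward voltage at 3.0 A, read from an I-V graph

output_w = output_a * led_vf     # watts delivered to the LED
efficiency = 0.80                # assumed switching-driver efficiency
input_w = output_w / efficiency  # watts drawn from the batteries

input_v = 7.4                    # 2x18650 nominal (in practice it sags)
input_a = input_w / input_v
print(f"Tailcap draw: about {input_a:.1f} A at {input_v} V")  # ~1.7 A
```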

And are most LED drivers constant current or constant voltage? Or neither?

sigh … maybe I should just take a class on this stuff. :slight_smile:

LED drivers are all constant current. LEDs are rated/specified based on current. Everything revolves around the current. Wattage is often a secondary concern, although it’s actually very important.

Choose a point on the graph and control either current or voltage. You’ll quickly see that controlling current within a small region (say 0.1 A) keeps power (wattage) in a much smaller range than controlling voltage at the same resolution (say 0.1 V).
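To see that numerically, here's a toy diode-style model; the exponential parameters are invented to loosely resemble a power LED, not taken from any datasheet:

```python
import math

# Toy exponential I-V curve: I = I_S * exp(V / N_VT). Invented parameters.
I_S, N_VT = 1e-12, 0.115

def current(v):  # amps at a given forward voltage
    return I_S * math.exp(v / N_VT)

def voltage(i):  # forward voltage at a given current
    return N_VT * math.log(i / I_S)

# Control current within +/-0.1 A of 3.0 A: power stays in a narrow band.
for i in (2.9, 3.0, 3.1):
    print(f"I = {i:.1f} A -> P = {i * voltage(i):.2f} W")

# Control voltage within +/-0.1 V of the same point: power swings wildly.
v0 = voltage(3.0)
for v in (v0 - 0.1, v0, v0 + 0.1):
    print(f"V = {v:.3f} V -> P = {v * current(v):.2f} W")
```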

I can only think of two LED drivers which are constant voltage, and they are both extreme outliers. (Suffice it to say that I can think of many LED drivers.) Basically, CV is a bad idea for an LED driver. That’s why we don’t do it.