Why do battery volts matter to drop-ins?

I’m still having trouble understanding why battery volts matter to the drop-in. All drop-ins have a voltage range, and the batteries loaded into the host can’t exceed the voltage rating on the drop-in. Why? If the drop-in has a regulator that limits the amps it will draw, at both the upper and lower end, before the flashlight shuts off, why does it matter if I have 2 CR123s or 4? Can someone explain this without mentioning the word Ohms??

I’m not an electrical engineer, so the formulas go over my head.

To actually understand this you’ll have to at least understand conservation of energy (i.e. energy doesn’t appear or disappear). It would be good to also understand that volts * amps = power. Since we lose some energy to heat through inefficiency in the driver, Power-out < Power-in, right? […]
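
A minimal sketch of that power balance, with made-up numbers (the cell voltage, the current draw, and the 85% efficiency below are assumptions, not the specs of any particular drop-in):

```python
# Conservation of energy across the driver: whatever reaches the LED came from
# the battery, minus losses.  volts * amps = watts on both sides.
battery_volts = 4.2   # one full li-ion cell (assumed)
battery_amps = 2.0    # current the driver pulls from that cell (assumed)
efficiency = 0.85     # assumed driver efficiency

p_in = battery_volts * battery_amps   # power drawn from the battery
p_out = efficiency * p_in             # power that actually reaches the LED

print(f"Power in:  {p_in:.2f} W")
print(f"Power out: {p_out:.2f} W  (always less than power in; the rest becomes heat)")
```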

Higher voltage will destroy components not rated for it.

EDIT: Your TV has a circuit which allows it to run on something like 100-250 volts AC. Why can’t you run it on 500 V AC?

The magic smoke will escape :wink:

Read this: http://flashlightwiki.com/Driver

Voltage is just a difference of potential between two points. This difference of potential causes current to flow from the highest point to the lowest. Imagine two containers of water, one at sea level and the other 10 meters above it. If we connect the two containers with a tube, water will flow from the container 10 m up into the one at sea level. The greater the difference in height, the faster the water will flow. Now imagine we place a small turbine near the sea-level container. That turbine generates current, and the faster the water flows, the more current we get from it. But the turbine can only withstand so much pressure; let’s say that above a 100-meter difference the pressure will destroy it. That is the limit it can tolerate in the difference of potential between the two containers.

You see, the difference of potential is what makes things move: in our example it is water, and in a flashlight the difference of potential between the two poles of the battery makes current flow through the components, for example the switch, the driver, and finally the LED. When you put two batteries in series you double this difference of potential, from 4.2 V to 8.4 V, and this increases the amount of current in all the components, so the LED gets brighter. But just as with the small turbine above, there is a limit to the current these small parts can tolerate before they heat up and break (the current, the I, is the thing that heats up the elements and does the other things, like making light in the LED). When you read that a driver is rated to work with 1 cell, i.e. 4.2 volts, that means this is roughly the maximum difference of potential it can tolerate. Increase that difference, the V, and you will push more current than these parts can tolerate and they will burn. So that’s the explanation, and let me check… yes, no Ohms in here :stuck_out_tongue:
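
If it helps to see the series-cell point as plain arithmetic, here is a throwaway check; the 4.2 V per cell and the 4.5 V driver limit are assumed numbers, not the rating of any real drop-in:

```python
# Series cells add their voltages; the driver only sees the total at its input.
cell_volts_full = 4.2    # a fully charged li-ion cell (assumed)
driver_max_input = 4.5   # rated maximum input of a hypothetical 1-cell driver

for cells_in_series in (1, 2):
    pack_volts = cells_in_series * cell_volts_full
    verdict = "within the rating" if pack_volts <= driver_max_input else "over the rating -> parts burn"
    print(f"{cells_in_series} cell(s): {pack_volts:.1f} V at the driver -> {verdict}")
```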

Let’s say the Vf of the LED is 3.7V and the LED will be driven at 4A current:

If you take 1 full battery at 4.2 V, the driver has to “eliminate” (4.2-3.7)*4 A = 0.5*4 = 2 watts, which is bearable.
If you take 2 full batteries at 4.2 V, the driver has to withstand (8.4-3.7)*4 A = 4.7*4 = 18.8 watts, which translates to molten solder, a strong electrical stench, and an order for a new driver from e.g. fasttech.
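
The same arithmetic as a few lines of code, for a driver that simply burns off the excess voltage as heat (the 3.7 V Vf and 4 A drive are the numbers from the example above):

```python
# A linear driver drops (V_in - Vf) across itself at the full LED current,
# so the wasted power is (V_in - Vf) * I.
led_vf = 3.7       # LED forward voltage, volts
led_current = 4.0  # LED drive current, amps

for cells, v_in in ((1, 4.2), (2, 8.4)):     # one or two fully charged cells
    wasted = (v_in - led_vf) * led_current   # watts the driver must dump as heat
    print(f"{cells} cell(s) at {v_in} V -> driver dissipates {wasted:.1f} W")
# 1 cell -> 2.0 W (bearable); 2 cells -> 18.8 W (molten-solder territory).
```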

Let’s check, yes, no Ohms involved… :wink:

I see this is more complicated than I originally perceived. I thought the driver was like a DC converter stepping down the electrical charge and only drawing what was needed to power the LED. I see that it is not like that at all.

So I guess this answers a second question for me. Loading more batteries, without exceeding the bulb/driver limit, will produce the most brightness, as long as the batteries are fully charged. Thanks

I see why some people shy away from flashlights with these unfamiliar batteries and non-standard LED bulbs. We have all grown up just having a regular bulb and standard AA, AAA, C, or D batteries. But once you move into the LED world and want to push the performance of the LED, it gets quite sophisticated quickly.

Now I have to dive into the rechargeable battery rules and limits. I had some old Dell laptop battery packs. I pulled one apart and harvested 6 out of 8 Sanyo 18650s. So I need to buy a good charger and a multimeter and see if I can charge and use these without damaging anything. I, like most members here, like to get maximum performance from my tools. But this is quite complicated until you really get into it. Thanks for all the help.

And no Ohms were harmed in the making of this thread. :>)

Both incorrect. Your original understanding was closer, it’s just that your understanding of DC-DC converters is limited. Please read comfychair’s link, then return with questions. Also see my post earlier - I wasn’t joking.

What is confusing is that the linked article starts with this:

The driver consists of the electronics that take power from the battery and send power to the LED. They amplify or reduce the voltage from the batteries to the correct level for the LED and also control the amount of current that can be delivered.

Which sounds like a power reducer that draws only the power needed and no more. Of course, later in the article it qualifies this statement. But I will read it a couple of times and then come back with questions. Thanks

Yeah, frankly it may not be a great article. I just don’t have a better one to point you at.

FWIW I think I recall that you are looking at the SF dropins. They are buck converters, so it may be easier for you to focus on that aspect.
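
As a rough sketch of that buck behaviour (all numbers here are assumptions, not the specs of the SF drop-ins): inside its rated input window a buck driver holds the LED current constant, so adding cells doesn’t add brightness, it just lowers the current drawn from the battery; outside the window you either fall out of regulation or exceed what the parts are rated for.

```python
# Rough model of a buck drop-in: regulated (constant) LED current inside the
# rated input window, with the battery supplying the same power at its own voltage.
led_vf = 3.3               # LED forward voltage, volts (assumed)
led_current = 2.5          # regulated output current, amps (assumed)
efficiency = 0.85          # assumed buck efficiency
rated_input = (4.0, 13.0)  # hypothetical drop-in input range, volts

p_led = led_vf * led_current   # power the LED needs
lo, hi = rated_input

for v_in in (3.0, 6.0, 9.0, 12.0, 18.0):   # roughly 1, 2, 3, 4, 6 CR123s
    if v_in < lo:
        print(f"{v_in:4.1f} V: below the rated minimum -> falls out of regulation, output sags")
    elif v_in > hi:
        print(f"{v_in:4.1f} V: above the rated maximum -> components not rated for it can fail")
    else:
        i_in = p_led / (efficiency * v_in)  # input current falls as input voltage rises
        print(f"{v_in:4.1f} V: in regulation, LED held at {led_current} A, battery supplies {i_in:.2f} A")
```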

Without investing some effort towards understanding the basics of volts, amps, ohms, etc., this is going to be impossible to understand at the level you're asking for.