Boost driver efficiency?

Hello.

I couldn't find any data on the calculated efficiency of boost drivers in different setups, especially the relation between efficiency and the amount of voltage amplification. For example, what is the average efficiency of a boost driver in a setup with one 18650 battery (~4.1 V) and a ~6 V LED, or with one 18650 battery (~4.1 V) and a ~12 V LED quad? How does it compare to buck drivers or linear drivers in general? Especially considering the fact that a boost driver can discharge batteries to a lower voltage.

Could anybody shed some light on this topic?

I have a cheap board that claims 96%, but it’s huge compared to the drivers in flashlights.

Subscribed.

Boost driver efficiency is largely dependent on the ‘boost’ factor, or how much it has to step up the voltage and the corresponding current/power in watts. If you boost a single li-ion to 8.4v your efficiency will be higher than if you’re trying to boost 3.7v to, say, 12.6v, where the efficiency drops quite a bit. If a driver claims 96%, figure on dropping it by about 15 points for losses to heat and parasitic draw.

The LED will be drawing a constant current and the driver has to keep up. The closer your source voltage is to what the load needs, the less the driver draws from the battery, the less heat it develops and the better it runs. Say your LED needs 12v, like an XHP35, and you’re trying to run it off a single li-ion: your driver will be almost tripling the voltage, unlike if you’re using a 2S or 3S setup. If your LED is pulling 12v at 1.5A, about 18w, then adding the 15% efficiency drop, your boost driver needs to supply 27w.
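To put rough numbers on the ‘boost factor’ point: the driver has to pull P_in = P_out / η from the battery, and the lower the pack voltage, the more current the cells have to supply. A quick sketch (the per-pack efficiency figures here are round-number assumptions for illustration, not measurements):

```python
# Rough sketch: what a boost driver must pull from the battery for an
# 18 W LED load, with ASSUMED driver efficiencies per pack voltage.
def boost_input(p_out_w, efficiency):
    """Input power the driver draws from the battery: P_in = P_out / eta."""
    return p_out_w / efficiency

p_led = 12.0 * 1.5  # 18 W load: 12 V at 1.5 A
for v_batt, eta in [(4.1, 0.80), (8.2, 0.90), (12.3, 0.95)]:
    p_in = boost_input(p_led, eta)
    print(f"{v_batt:4.1f} V pack: P_in = {p_in:4.1f} W, "
          f"battery current = {p_in / v_batt:.1f} A")
```

Note how the 1S case is punished twice: the assumed efficiency is lower and the battery current is several times higher, which adds resistive losses on top.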

Linear regulators and direct-drive FETs are much more efficient since they don’t increase or decrease the voltage and don’t require inductors or coils with their inefficiencies. If your LED needs 18W it gets 18W, and so on until the battery starts draining and LVP kicks in and shuts it off.

Interesting discussion.

Yes, FETs are almost 100% efficient; however, the over-driven LED is where the big efficiency loss happens. Much better to use constant current if you’re not using full output.

Linear drivers are good, but only when the battery voltage is close to the forward voltage of the LED. Unfortunately, for most 3-volt LEDs the battery voltage is well above what is needed until the battery is almost empty. All that “excess” voltage gets burned off as heat in the driver, which can result in a sizable loss of efficiency.
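That heat loss is easy to put numbers on: a linear driver passes the LED current straight through, so its best case is η = V_led / V_batt. A minimal sketch, assuming a 3.0 V forward voltage:

```python
# Best-case linear driver efficiency: the LED current flows straight from
# the battery, so eta = V_led / V_batt and the excess voltage becomes heat.
v_led = 3.0  # ASSUMED forward voltage of a typical "3 V" emitter
for v_batt in (4.2, 3.7, 3.2):
    eta = v_led / v_batt
    print(f"{v_batt} V cell: ~{eta:.0%} efficient, "
          f"{1 - eta:.0%} of battery power burned in the driver")
```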

Zebralight seems to have some of the most efficient boost-drivers in their lights. I’m not sure what they claim as the percent, though.

Is there a formula to calculate the approximate efficiency drop depending on the boost factor? Or maybe an average coefficient?

Isn’t that a 50% efficiency drop? (18W × 1.5 = 27w)
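For what it’s worth, those two figures describe different losses: a 15% loss would mean 18 / 0.85 ≈ 21.2 W in, while 18 W out of a 27 W input is only about 67% efficiency. Quick check:

```python
p_out = 18.0             # 12 V x 1.5 A LED load
print(p_out / 0.85)      # ~21.2 W input if the driver loses 15%
print(p_out / 27.0)      # 18 W out of 27 W in -> ~0.67, i.e. ~67% efficient
```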

It’s not an accurate representation of efficiency unless you factor in things like battery life and heat. I have a 100W COB flashlight running on a homemade 4S2P li-ion pack made from laptop batteries through a 250w boost converter. On full power it runs for maybe 50 seconds before the BMS cuts the power. It’s drawing 10A from the batteries while stepping 16.8v up to 34v. It generates a lot of heat and is really inefficient. If I were to run it on, say, 7S or 8S it would be a lot more efficient and my run time would be better.
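Taking those quoted figures at face value (and assuming the COB really gets its nominal 100 W), the implied numbers look like this:

```python
# Implied efficiency from the quoted figures (assumed, not measured):
v_pack, i_pack = 16.8, 10.0   # 4S2P pack, 10 A draw
p_in = v_pack * i_pack        # 168 W pulled from the pack
p_cob = 100.0                 # nominal COB power, assumed actually delivered
print(f"input {p_in:.0f} W -> implied efficiency ~{p_cob / p_in:.0%}, "
      f"~{p_in - p_cob:.0f} W heating the converter")
```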

Driver efficiency by itself is not the point.
Usually you need a fixed lumen output for as long as possible.
FET drivers require PWM for usable modes, and the LED(s) always sit at an inefficient point on the V-I curve.
Linear drivers eat the difference between the cell voltage and the LED’s forward voltage: from ~35% loss with full cells down to a few % with empty cells.
Boost and buck drivers can be very efficient, like 85 to 90%. The voltage boost or buck factor is not the main influence. Modern gadgets can be charged over the 9v QC standard and then bucked down to a single li-po cell, and those internal controllers are more efficient than the old ones (5v linear to 4.2v).

This is very true. New buck and boost circuits are more efficient than ever. The wall warts you use for your router, cell phone charger, or laptop power supply take 115v or 240v AC down to 12v or 5v DC in a small package. Very little heat is generated from those unless you really push them. IGBT tech has made it possible to get 160A DC from a lunchbox-sized power supply.

I’ve seen this statement many times, but I haven’t seen anyone actually measure it. I have, however, seen end-to-end efficiency figures for their flashlights, and they’re good but not special.

“The losses from the driver, reflector and lens are very small. I got 117 lm/W on the H2 mode. This is the highest I’ve ever tested for a CRI90 flashlight. 80 lumens for over 21 hours is also quite a feat.”

These are maukka’s impressions. I don’t share them.

They also usually last longer in regulated mode than buck drivers due to lower resistance, except in cases where low-Vf emitters are involved.

Sorry, but I don’t understand what charging a cell phone has to do with boost driver efficiency.

I’m not sure what you mean. Are you saying maukka’s measurements are wrong, or that 117 lm/W for a regulated high-CRI light isn’t very good?

Sure, you can do better than a boost-driver. Just stick an LED in direct series with a 3v battery. 100% efficient. Done. No losses from lens or reflector, either. Regulation sucks, though.

According to Texas Ace’s tests, an E21A quad with better R9 and a much better tint, matched with 90% efficient optics (good but not special) and a 90% efficient driver (good but not special), would be slightly more efficient at the same H2 output level and significantly more efficient at lower levels.
Note that TA lumens may differ from maukka’s. Overall, the setups would come out close.
So… yeah, I view that Zebra as good but not special.
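The comparison is just the chain of efficiencies multiplied out: emitter lm/W × optic × driver gives out-the-front lm/W. A sketch; the emitter efficacy below is a placeholder I picked for illustration, not a Texas Ace or maukka figure:

```python
# Out-the-front efficacy is the efficiency chain multiplied out.
emitter_lm_per_w = 150.0  # ASSUMED E21A quad efficacy, placeholder only --
                          # substitute Texas Ace's measured value here
optic_eff = 0.90          # "good but not special" optic, per the post
driver_eff = 0.90         # "good but not special" driver, per the post
print(f"~{emitter_lm_per_w * optic_eff * driver_eff:.0f} lm/W out the front")
```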

Let’s take as given that most hosts can’t dissipate more than 4-5w for long.
At a 1.5A current, how much voltage is needed (with modern LEDs)? A little over 3v. How much capacity can be drawn from a cell between 3.2 and 3.0v? A few percent.
Most power banks have a single li-po or several parallel-connected li-ion cells. You have a 4v output that gets boosted up to 9v, then bucked down to 5v, which feeds the cell-charging circuit. And even with all those steps it’s more efficient than the old linear current limiter from 5v down to cell voltage.
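Each stage in that path multiplies in, and the chain can still land in the high 70s. A sketch with assumed per-stage efficiencies:

```python
# Power-bank -> phone path: stage efficiencies multiply.
# Per-stage figures below are ASSUMED, for illustration only.
stages = {
    "4 V cell -> 9 V QC boost": 0.92,
    "9 V -> 5 V buck":          0.93,
    "5 V -> cell charger":      0.90,
}
total = 1.0
for name, eta in stages.items():
    total *= eta
    print(f"{name}: {eta:.0%}")
print(f"end-to-end: ~{total:.0%}")  # ~77%, vs 60-84% (V_cell / 5 V)
                                    # for the old linear limiter
```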

E21A quad
90% efficient optics
Good luck :person_facepalming:

Great in theory. Build it, and see if the actual output efficiency matches the theory.

https://www.youtube.com/watch?v=jYMGCLsytT4 — boost converter efficiency testing (different currents, different “boost” factors).

I am not saying that linear drivers are the best all-round solution, but they can provide decent results even compared to buck drivers depending on the scenario.

Linear drivers make more sense with a LiFePO4 cell, where most of the cell’s run-time is spent around 3.2v. They make less sense with standard lithium-ion cells, where most of the run-time is spent at 3.7v. Unfortunately, that’s what they’re most often used with. Cheap, and better than a FET at low outputs, but not as good as a decent boost driver.
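The same η = V_led / V_cell math makes the LiFePO4 point concrete. A sketch with typical resting voltages (assumed, with Vf fixed at 3.0 V for simplicity):

```python
# Linear driver efficiency over discharge: eta = V_led / V_cell.
# Cell voltages are typical resting values, ASSUMED for illustration.
v_led = 3.0
curves = {
    "li-ion":  [4.2, 3.9, 3.7, 3.4],  # most run-time spent near 3.7 V
    "LiFePO4": [3.4, 3.3, 3.2, 3.1],  # flat plateau around 3.2 V
}
for name, curve in curves.items():
    avg = sum(v_led / v for v in curve) / len(curve)
    print(f"{name}: average efficiency ~{avg:.0%}")
```

With these assumed curves the LiFePO4 cell averages around 92% against roughly 79% for a standard li-ion, which is the whole argument in two numbers.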