Anything that will replace NiMH soon?

I’m not sure what sort of simple LED driver gives 90% efficiency from a single Li-ion cell; most come nowhere close. FET direct drive is the worst, linear is better, but you need a buck driver for the best efficiency.

A FET or linear driver simply burns off the excess voltage from the cell (compared with the Vf of the LED) as heat. In the case of the FET, most of the heat ends up in the over-driven LED, which then operates at a less efficient current level even when PWMed to lower brightness. With a linear driver it is dumped in the driver itself.

A boost driver for e.g. NiMH is no more complicated or less efficient than a buck; they are almost the same circuit design.
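To illustrate how close the two topologies are, here are the idealized (lossless, continuous-conduction-mode) transfer relations. The voltages and duty cycles below are just example numbers, not taken from any particular driver:

```python
# Idealized buck and boost transfer functions (lossless, continuous
# conduction mode). Both converters use the same basic parts (switch,
# inductor, diode/synchronous FET), just rearranged.

def buck_vout(v_in, duty):
    """Buck steps voltage down: Vout = D * Vin, with duty cycle 0 < D < 1."""
    return v_in * duty

def boost_vout(v_in, duty):
    """Boost steps voltage up: Vout = Vin / (1 - D)."""
    return v_in / (1.0 - duty)

# Example: Li-ion (4.2 V) bucked down to a 3.0 V LED...
d_buck = 3.0 / 4.2            # duty cycle needed, ~0.71
# ...versus a single NiMH (1.2 V) boosted up to the same 3.0 V.
d_boost = 1.0 - 1.2 / 3.0     # duty cycle needed, 0.6
```

In the ideal case both deliver the full input power to the LED; real converters lose a few percent to switching and resistive losses in either direction.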

Thanks for enlightening me. :)

I had gotten the impression somewhere on this site that boost drivers were the least efficient, and that Li-ion drivers are more efficient due to the higher voltages involved.

Do you have any suggestions for interesting topics about boost driver efficiency I can read, to learn more?

Direct drive is 100% efficient.

Not really; there is resistance everywhere, and adding PWM reduces efficiency more for direct drive than for other approaches.

Sure, no heat is dissipated in the driver, but with e.g. a cell at 4.2 V and an LED that needs, say, 3.5 V for the desired brightness, you are hammering the LED into an operating region way beyond peak efficiency.

PWMing it to lower brightness doesn’t change these facts.

There may be no wasted power dissipated in the driver, but it is instead dissipated in the LED, in the internal resistance of the cell, and in other bits of the torch, springs etc.

Direct drive is the crudest and least efficient way of powering an LED, until the cell is discharged down to the LED’s Vf, by which point it is pretty well empty.

It is not obvious how inefficient direct drive is. Looking at my example you might think the system is (3.5/4.2) = 83% efficient, but far from it: you also need to factor in the much higher current drawn by the LED at the higher voltage, which you can estimate using e.g. djozz’s measurements of LED transfer characteristics.
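To make that concrete, here is a rough sketch of the calculation. The LED curves below are invented, smooth stand-ins purely for illustration, NOT real measurements; djozz’s tests are the place to get actual numbers:

```python
# Rough sketch: why direct-drive efficiency is worse than the naive
# Vled/Vcell ratio suggests. Both LED curves here are made-up
# illustrative models, not data from any real emitter.

def led_current(v_led):
    """Hypothetical I-V curve: current (A) rises steeply above ~2.8 V."""
    return max(0.0, 10.0 * (v_led - 2.8) ** 2)

def led_lumens(i):
    """Hypothetical output curve: sub-linear in current (efficiency droop)."""
    return 1000.0 * i ** 0.8

def dd_operating_point(v_cell, r_series):
    """Bisect for the LED voltage where Vcell = Vled + I * Rseries."""
    lo, hi = 2.8, v_cell
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if mid + led_current(mid) * r_series > v_cell:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

# Regulated: a buck driver holds the LED at 3.5 V; assume 90% converter efficiency.
i_reg = led_current(3.5)
lm_per_w_reg = led_lumens(i_reg) / (3.5 * i_reg / 0.90)

# Direct drive: full cell (4.2 V) through an assumed 50 mOhm of cell + spring resistance.
v_dd = dd_operating_point(4.2, 0.05)
i_dd = led_current(v_dd)
lm_per_w_dd = led_lumens(i_dd) / (4.2 * i_dd)   # cell power includes the IR loss

print(f"regulated: {lm_per_w_reg:.0f} lm/W, direct drive: {lm_per_w_dd:.0f} lm/W")
print(f"naive voltage ratio: {v_dd / 4.2:.0%}, actual lm/W ratio: {lm_per_w_dd / lm_per_w_reg:.0%}")
```

With these made-up curves the overall lm/W penalty comes out noticeably worse than the simple voltage ratio suggests, which is the point: the overdriven LED, not just the wiring, is where the efficiency goes.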

The most efficient system is to power the LED with direct current, at the level needed to achieve the desired brightness. LED efficiency drops dramatically when over-driven by direct drive. In the limit you can easily reach the point where e.g. 50% more power gives only 10% more light.

Once you start modding with e.g. spring bypasses, better FETs, in pursuit of headline lumens figures in turbo for brief blasts, you are also further reducing efficiency in normal operation.

Don’t kid yourself, a FET driver is always pushing and stressing the LED at full turbo level (least efficient), and hammering the cell at higher instantaneous current leading to increased loss from internal resistance, even when it is PWMed back to much lower light output.

A linear driver can regulate the LED at an efficient operating point (generally, the lower the current, the more efficiently the LED performs) by burning off the excess voltage as heat. A buck (or boost) driver does this more elegantly, converting the voltage instead of wasting the excess as heat.
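A sketch of the power bookkeeping for the two approaches, at one assumed operating point (the 90% buck efficiency is a typical figure, not a measurement):

```python
# Where the excess power goes: linear vs buck regulation, at an assumed
# operating point of a 4.0 V cell driving an LED at 3.2 V / 2.0 A.
v_cell, v_led, i_led = 4.0, 3.2, 2.0
p_led = v_led * i_led                  # 6.4 W delivered to the LED

# Linear: the LED current flows straight through the pass element,
# and the excess voltage is burned off as heat in the driver.
p_cell_linear = v_cell * i_led         # 8.0 W drawn from the cell
heat_linear = p_cell_linear - p_led    # 1.6 W dissipated in the driver
eff_linear = p_led / p_cell_linear     # 0.80, i.e. exactly v_led / v_cell

# Buck: converts voltage, trading volts for amps; 90% assumed efficiency.
eff_buck = 0.90
p_cell_buck = p_led / eff_buck         # ~7.1 W drawn from the cell
```

Note that a linear driver’s efficiency is pinned to Vled/Vcell, so it gets worse the fuller the cell is; a switching converter’s losses stay roughly constant instead.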

If you want to be pedantic, yes there is some resistance which causes losses.
So technically it is more like 99.9% efficient instead of 100%.
Other than that, direct drive is literally just like a wire.
The power going into the driver is the same as the power coming out.

Driver efficiency has nothing to do with how much current or voltage your LED is using, it is simply power out from the driver divided by power in to the driver.

Nonsense. The driver is essentially an impedance matching network, matching the characteristics of the cell with those of the LED. It has to be looked at as a system.

A FET driver isn’t really a driver at all, just an on-off switch, performing no useful matching function, leaving the cell+LED system mis-matched and mostly operating at its least efficient point.

Fortunately it mostly works quite well, but there are much better (more efficient) ways of doing it instead.

Watt-hours in, lumen-hours out, (Edit: integrated over the cell duration) is to my mind the best measure of efficiency, but not easy to characterise. That’s why HKJ’s cell measurements and djozz’s LED measurements are so useful for those curious to get a grasp of what’s really going on.
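As a sketch of how that metric could be computed, integrate both quantities over a simulated discharge. Everything below is a crude placeholder (linear voltage sag, constant current, fixed LED efficacy and driver efficiency); HKJ’s cell curves and djozz’s LED curves are what you would plug in for a real estimate:

```python
# Crude watt-hours-in / lumen-hours-out bookkeeping over a discharge.
# All curves and constants are invented placeholders for illustration.

def simulate(v_start=4.2, v_end=3.0, i_cell=2.0, capacity_ah=3.0,
             driver_eff=0.9, lm_per_w=150.0, dt_h=0.01):
    """Constant-current discharge with a linearly sagging cell voltage."""
    wh_in = lmh_out = ah_used = 0.0
    while ah_used < capacity_ah:
        frac = ah_used / capacity_ah                 # fraction discharged
        v_cell = v_start + (v_end - v_start) * frac  # placeholder sag model
        p_in = v_cell * i_cell                       # power drawn from cell
        wh_in += p_in * dt_h
        lmh_out += p_in * driver_eff * lm_per_w * dt_h
        ah_used += i_cell * dt_h
    return wh_in, lmh_out

wh, lmh = simulate()
print(f"{wh:.1f} Wh in, {lmh:.0f} lm*h out -> {lmh / wh:.0f} lm/W average")
```

With a fixed efficacy this just recovers driver_eff times lm_per_w; the metric becomes interesting once the LED efficacy varies with current and the direct-drive current varies with cell voltage.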

That depends on which efficiency you look at. I like the lumen/watt definition, and direct drive is mostly fairly bad at that, because it either overdrives the LED or loses a lot of power in series resistance (and it is way worse than 99.9%). Overdriving an LED may get slightly more brightness, but the price is much more heat and much lower efficiency. Some may believe that is a good trade-off; I prefer much longer runtime and a bit lower brightness. Reducing brightness with PWM does not improve the efficiency!

I didn’t say anything about PWM. Read the post.

Nobody was talking about luminous efficacy, only driver efficiency.
If you’re going to mix both the LED and driver efficiency into one then you’re never going to have anything that makes sense, because LEDs aren’t even 50% efficient.
So it would make 0 sense for you to say 90% efficiency a few posts up.

Direct drive is literally just a wire with an on/off switch on it.
That’s why it has 99.9% efficiency.
There is near 0 resistance.

That does not exclude it.

Burning a lot of power in the LED is not very efficient; you always have to look at the total solution, i.e. from battery to light. Using the LED as a heater is usually a very bad idea, the only exception being if you are trying to break the record for the brightest light around.

Anyway, direct drive is a wire with some resistance. I have not seen any good explanation of whether it has to be 0.01 Ω or 1 Ω or 1000 Ω, or why you can or cannot switch the LED on/off at a fast pace, before it stops being direct drive.

FET drivers are certainly not efficient. If I compare my BLF A6 (which uses a FET at high levels) to my ZL SC600w (which uses a boost driver), I can get twice the runtime from my SC600 at about the same brightness as the A6. And the SC600 has the advantage of a regulated constant output, whereas the A6 decreases as the battery drains.

Boost drivers win by a huge margin.

You can argue semantics about where the inefficiency comes from, but as a whole system, FET sucks in every way except the simplicity of the design. Which is why it is so popular for high output budget lights.

See? This is the problem. You’re now trying to bring an LED into the equation.
Unless you know exactly the Vf and current-lumen curve of the LED, the temperature it is at, the colour of the LED, the colour temperature, etc., it is impossible to determine the efficiency.
This is no longer about the driver anymore, now you have another 20 variables you need to know in order to compare.
Having an XHP70.2 running at 200mA will be far more efficient than the typical XPL at 3 amps.
Having a green LED will be more efficient than having a white LED.
Having 100 LEDs will be more efficient than having 1.

Now you’re bringing subjectivity into it, do you want extremely high efficiency or usable output?

This is why, when you look at driver efficiency, you do not just assume that “oh, direct drive is always worse because the LED will run at higher output and less efficiency.”
Driver efficiency is ONLY power out / power in.

Driver efficiency (a real driver, not a FET or a bit of wire) is how well the driver matches the cell to the LED at the desired operating point, improving the efficiency of the system.

It can either burn the excess voltage off as heat in the driver (linear), or convert it efficiently as a buck or boost circuit does, subject to switching, copper and iron losses.

To my mind, a FET, or bit of wire, is zero percent efficient. It does nothing, except connect the cell to the LED. Whether it is fully on (Turbo), or PWMed to lower light levels, it contributes nothing to improve efficiency of the system, and generally runs the LED very sub-optimally, and simply dumps the surplus inefficiently as heat in the LED, and in the internal resistance of the cell. Better cells (lower internal resistance) simply move the heat more into the LED, which as HKJ said, is not a good thing, except for those only interested in headline short-term output numbers.

What you say is 100% true, but also 100% irrelevant. What does it matter if a short wire is 100% efficient? The whole purpose is to measure the efficiency of a total solution.

If you have a real-world example of an efficient light that uses a FET driver, let’s see it.

…bad news for Duracell and Energizer users. :expressionless:

Light output and efficiency are not subjective; they are easily measured and are very important metrics for lights. With a driver, one of the important parameters is how well it drives the LED, and the “well” covers both brightness and efficiency.
Efficiency of direct drive can be anywhere from below 1% to near 100%, depending on what you include in it and how you define it, but it does not say how much battery power gets converted into light.

You’re right, and that’s sad. (Umm, that 14500s are dead-ended, not that you’re right.)

14500s are not AA drop-ins, ’cause a lot of things won’t take well to being hit with 4V when they’re expecting 1.5V. Certainly not with series-connected cells. Motors (shavers, etc.) will run obscenely fast, hotwire bulbs will have a short fraction of the lifetime, Si-based goodies can let out the Magic Smoke™, all sorts of bad things can happen if the doodad in question isn’t designed to tolerate Li cells.

Joe Idiot who buys 14500s and pops everything he puts them in, will badmouth 14500s like crazy. WallyWorld isn’t stoopit enough to sell 14500s because of all the Joe Idiots running around loose. At least 18650s won’t fit into anything that’s not designed for them. 14500s? The Silent Killer of AA-based electronics…

Now, maybe flat-top 14500s would sell, doodads can be designed to have built-in protection that only cells with a nipple on top would make contact. That might work. But button-top 14500s are a specialty item, and no mfr is going to put too much effort into them except for niche markets like flashlights and… ummm… whatever else might use 14500s without damage.

Being extra pedantic, make a distinction between direct-drive and FET-drive.

DD is just crowbarring the LED right across the cell.

Sucks for Li-ion, but works beautifully for LiFePO4 cells. :smiley: 3.2V from shortly out of the charger to right before giving it up completely.

Here are a couple of pointers:

Edit: and one more:

Yeah, DD is literally turning the driver into just a wire that lets all power through.
Seems like the people above are having trouble understanding that.

Now they’re trying to bring LEDs into the equation and say “oh it’s not efficient because LEDs are less efficient at higher current” Well yeah but that obviously completely depends on what LED and power source you use and has nothing to do with the driver anymore.

The driver itself, just like a wire, has minor resistance losses and that’s it.
Anyway, the thread is derailed enough so I’ll stop trying to argue with those people :stuck_out_tongue: