Active feedback on PWM to current-regulate with FET?

I know, first post (here, not anywhere), so what do I know? Maybe not much. And maybe I'm way behind the ball on this thought, but a couple of questions and an idea:

How efficient is an LED driven by an FET when it's being pulsed, compared to say a linear regulator, in the context of a single 18650 and an XM-L? The FET itself is more efficient, but you're driving the LED at high current in brief pulses. I guess the LED's efficiency is more aligned with what you'd expect from the average current though, not the pulse current, so my guess is the FET setup still wins, no?

I suspect this is why the bistro clicky firmware exists with an option for 50% power on its high mode: get the efficiency of an FET, but without a battery-draining 6A (or whatever) high mode that you maybe don't even want.

That's great, maybe, but once the battery is down to say 3.6V, and especially depending on the type of battery, a 100% mode is likely no longer 6A and 1300 lumens. It's maybe more like 1A, and 50% pulsed is then even less. So wouldn't it be nice to have that 50% FET mode when my battery is full, but maybe 100% when it's a bit low? Then I keep more constant output.

In other words, active-feedback on PWM to current-regulate with an FET, or some approximation of it.

The closest thing I've noticed is, I think, the H17F, which regulates on heat. That's a reasonable proxy for current, but it's going to have unnecessary short-term fluctuations that aren't wanted in this context. The goal here is controlled battery life and consistent output. I don't want the 6A even for a few seconds; I just want a controlled power level.

So ideally what I'm saying is: monitor current and adjust PWM accordingly (preferably smoothly too). However, I don't know if there's a way to monitor current. It seems like there are ways to monitor temperature, or to estimate/fake it. It seems like some software has ways to monitor battery level. That would be a start at least: increase duty cycle as battery level drops (up to some point where you might eventually want the opposite). But I guess if battery level can be monitored, then current can be estimated more directly too.

Does this exist? Is it crazy? Is linear regulation more efficient anyway?

Updating first post with summary:

a) monitor battery voltage, assume it is VERY close to the LED voltage, and determine LED current from a parameterization.

or b) change hardware to add better current monitoring.

Pseudocode for a):

while (true) {
    turn_switch_on();                       // 100% duty factor during the ADC read
    sleep_microseconds(100);                // (maybe) let the reading settle
    pulse_V = read_adc(blah);               // LED/battery voltage under full load
    pulse_current = a + b*pulse_V + c*pulse_V*pulse_V;            // LED I-V parameterization
    duty_factor = min(100 * mode_current / pulse_current, 100);   // in percent, capped at 100
    enable_PWM(duty_factor);
    sleep(30);                              // until the next correction
}

a, b, and c define the current-vs-voltage curve for a particular LED. c can optionally just be set to 0 for a simplified linear curve.
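To pin that down a bit, here's roughly what the parameterization could look like in C. The coefficients are pure placeholders, not measured values - they'd have to be fit to the datasheet or a bench I-V curve for whatever LED the firmware targets - and real ATtiny firmware would probably use scaled integer math instead of floats.

/* Placeholder fit coefficients for one particular LED (NOT real values).
 * Leave IV_C at 0 for the simplified linear curve. */
static const float IV_A = 0.0f;   /* placeholder */
static const float IV_B = 0.0f;   /* placeholder */
static const float IV_C = 0.0f;   /* placeholder */

static float pulse_current(float pulse_V)
{
    return IV_A + IV_B * pulse_V + IV_C * pulse_V * pulse_V;
}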

Learning a little about how these ATtinys work. OK, so I get now that there are three analog inputs that can be read by the ADC. I'm guessing that in many of these drivers a couple of them are already used to read stars and memory capacitors, and one is left to read battery voltage? Anyway, stars and memory are optional features and there might be a workaround.

I can imagine a couple of ways to do what I'm saying.

Option 1) Just read the LED voltage (which is close enough to the battery voltage anyway; the important thing is that it's measured under load). The pulse current derived from that, multiplied by the PWM duty factor, gives the average current draw, but only for one known LED with a known voltage curve, and with corrections for temperature (though I don't think this needs to be THAT precise; no power modes are).
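Written out, the estimate is just: I_avg ≈ D × I_pulse(V_measured), where D is the PWM duty fraction and I_pulse(V) is the LED's current at the voltage measured during a full-on pulse.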

Still, it depends on the voltage curve of a particular LED, which is not quite ideal, and it requires a reasonably accurate parameterization. That's probably OK though, and even an offset linear approximation is probably good enough here.

Option 2) Read the voltage drop across the FET itself. This requires an extra input, and I don't know if the drop is big enough to read well; actually, we should hope it's pretty small, but it might still be big enough. It is directly in the current path and directly proportional to LED current. This option requires a new board design. The advantages are that real current is measured and it isn't dependent on which LED is used.
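A minimal sketch of option 2, assuming an Rds(on) of roughly 10 mΩ (just a guess for these FETs); the real on-resistance varies between parts and with gate voltage and temperature, so this would need per-board calibration to be more than rough:

/* Option 2 sketch: estimate the pulse current from the voltage across the
 * FET while it is switched on. RDS_ON_MOHM is an assumed value, not a
 * measured one. */
#define RDS_ON_MOHM 10   /* assumed ~10 milliohm FET */

static unsigned int pulse_current_mA(unsigned int vds_mV)
{
    return (unsigned long)vds_mV * 1000UL / RDS_ON_MOHM;   /* I = V / R */
}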

Either way, read it periodically and make a correction to the PWM duty factor to maintain the average target current. No convergence loop is required; a single (preferably slowly ramped) proportional correction should be fine, certainly until the next update.
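And a sketch of what I mean by a slowly ramped proportional correction - the helper name set_pwm_level() is made up for illustration, not from any existing firmware:

extern void set_pwm_level(unsigned char level);   /* hypothetical: write the PWM register */

/* Nudge the 8-bit PWM duty toward the value that should average out to the
 * target current, moving only part of the way each update so output glides
 * instead of stepping. pulse_mA is the estimated current during a full-on
 * pulse; target_mA is the mode's desired average current. */
static int duty = 0;   /* 0..255 */

void correct_duty(long target_mA, long pulse_mA)
{
    if (pulse_mA <= 0) return;                    /* no valid estimate yet */

    long ideal = 255L * target_mA / pulse_mA;     /* duty that would hit the target */
    if (ideal > 255) ideal = 255;                 /* can't exceed 100% */

    int step = (int)((ideal - duty) / 4);         /* close 1/4 of the gap per update */
    if (step == 0 && ideal != duty)
        step = (ideal > duty) ? 1 : -1;           /* creep the last few counts */

    duty += step;
    set_pwm_level((unsigned char)duty);
}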

Yes, linear regulation is more efficient in terms of lumens per watt. Many of us are not so interested in sacrificing output to get a lower, regulated level, though. High amps on these LEDs is not efficient, but it gives us max output - twice the power, 50% more output - we can be OK with that. Sure, as the cell drains, so does output. The good thing about PWM is everything scales well; the bad thing, of course, is efficiency. That's part of the reason the triple-channel drivers are attractive - a single 7135, a bank of 7135s, and a FET - a great combo of amp-limiting regulation and max power, plus we can PWM all channels, so smooth ramping can be implemented across all three.

My opinion is that current feedback adjusting PWM to maintain constant output is not an efficient way to go. Better to go with true linear regulation like so many of the better drivers already have, like the LD-2 driver: https://budgetlightforum.com/t/-/33706. I believe their limitation is that you can't do smooth ramping because the output levels are fixed - I think that's the case...

Well I certainly know that 100% FET modes at 6A (18650 and XM-L example) are far less efficient in terms of lumens per watt than linear regulation at 2A, so ok.

Is it really clear, though, that an FET with PWM producing a 2A average from 6A pulses is definitely less efficient than a 2A linear regulator would be? If there's data you can point at off the top of your head I'd appreciate it (that's not meant to be argumentative, I'd just be interested).

One thing that seems very clear to me is that improving thermal performance has a huge effect on reducing the efficiency drop-off at high current. 2A average from pulses at 4.5 kHz produces the same temperature in the LED, I'd guess, as 2A constant, no? So I was hoping 6A at 33% PWM would be quite a bit more efficient than 6A flat out. Flat out, it seems like roughly a 25% hit compared to 3A (put another way, the extra 3A are only about 50% as productive as the first 3A).

I'll believe the experts. I guess I'm just wondering: is this intuition, or is it really a clearly known fact that linear regulation wins? Of course it depends on the mode. At very low power, pure FET PWM stays equally inefficient at worst (no? still 6A pulses), while linear gets worse (it must shed more volts, if we're talking pure linear at low power, no PWM). At 3A in this example, it does seem like a potentially close call at least.

Anyway, I was just about to buy some Qlites, but you may have changed my mind toward those 3-way drivers.

I think this is a good idea. I brought it up in the firmware development thread and there were a few replies/ideas.

I think you pretty much explained the options in your second post. To do it right, a direct way to sense the current is needed, and while this is possible, either using a sense resistor and amplifier or a Hall sensor, it would require a hardware redesign. An approximation of constant-current modes could be attained with just some clever firmware modifications, but they would be specific to an LED and/or battery combination. For example, if you wanted a constant 4A mode, with a fresh battery that would be, say, 70% PWM; then if the battery voltage fell to 3.9V, which the MCU can sense, the PWM would be raised to, say, 90% to maintain 4A. This could work pretty well, but it would require custom FW for each setup, roughly as sketched below.
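As a rough illustration of that firmware-only approach (every number below is a made-up placeholder - a real table would have to come from bench measurements of one specific LED and cell):

/* Map sensed battery voltage (ADC counts) to the PWM level that roughly
 * holds the desired current for ONE particular LED/battery combination.
 * All entries are placeholders, not measured values. */
struct volt_pwm { unsigned char adc_counts; unsigned char pwm; };

static const struct volt_pwm four_amp_table[] = {
    { 175, 180 },   /* ~4.2V: lower duty needed       */
    { 168, 205 },   /* ~4.0V                          */
    { 162, 230 },   /* ~3.9V: raise duty to hold ~4A  */
    { 155, 255 },   /* ~3.7V and below: flat out      */
};

static unsigned char pwm_for_voltage(unsigned char adc)
{
    unsigned char i;
    /* Return the entry for the first threshold the reading still meets */
    for (i = 0; i < sizeof(four_amp_table) / sizeof(four_amp_table[0]); i++)
        if (adc >= four_amp_table[i].adc_counts)
            return four_amp_table[i].pwm;
    return 255;   /* below the table: just go flat out */
}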

Regarding LED efficiency, I’m not sure PWM is as bad as we think. I think the current understanding is: if an XPL produces 2000 lumens at 6A, reducing the average current to 3A using PWM would produce 1000 lumens. But the XPL produces 1250 lumens with 3A constant current, so that is a significant efficiency loss by using PWM. Well, using linear regulation is also inefficient since the voltage difference is burned off in the regulator, though it’s not that inefficient.

However, there are two conditions which cause LED efficiency drop at high current: the temperature rise associated with the high current, and the high current density itself. The high current density reduces the electron-hole recombination efficiency; I believe Auger recombination is the currently accepted mechanism responsible for the loss of electron-hole recombination efficiency at high current. I think these two mechanisms, high junction temperature and high current density, can be considered separately. In other words, looking at the graph here:

Say the direct drive current is 6A, then reducing the current by PWM would not just linearly reduce the lumen output, it would produce a curve with downward curvature because at 3A the LED is not as hot as at 6A, so it would be more efficient at 3A than at 6A. But because the current pulses are still ~6A, the efficiency is not as high as if the 3A were constant current. So the output with 3A PWM might be 1150 lumens, higher than 1000 lumens, but not as high as the 1250 lumens if the current were constant.

I haven’t seen this aspect of the LED efficiency talked about or tested.

Easyb, I think you covered many of the exact same thoughts I've had. Seems like some data is needed. There might not be huge gains, but 5% here, 5% there - this is how things get 30 or 40% better.

It does seem like at, say, 3A the linear loss is maybe about 10% (dropping from ~4.2V to a ~3.7V Vf, from my vague memory of XM-Ls), but the battery itself sags some too, so give or take. 6A is about 25% less efficient than 3A when cooled pretty well, as I recall (although I prefer to think of it as the top 3A being 50% as efficient as the first 3A). The question, as you say, is whether 15 of those 25 points are still from heat. It does seem like an uphill fight for the PWM, but it can only be answered with data.

The efficiency differences between PWM on a FET and linear regulation have been proven here, probably several times, plus the theory and Cree data sheets support it. Early on in the LD-1 development, I believe led4power was one of those proving it with data - it substantiated why he spent considerable R&D on his development, and why his drivers are so popular here on BLF.

Did you check the links to the testing on the OP of the thread I referenced? Real world measured data is all there - djozz and HKJ did the tests. If that's not enough for you, well, I can't say anymore...

I just did some simple testing. With an Eagle Eye X6 with a FET-only driver and an XPL HI V2 3B, I simultaneously measured current and output for different PWM modes (30, 50, 70, 100%). To measure the current, I completed the circuit at the tailcap with a 0.0094 Ohm piece of wire while measuring the voltage across it. To measure the lumen output I used my ceiling bounce test, so the lumen numbers are approximate, but the relative values should be accurate.

Below I have listed lumen values to demonstrate the approximate efficiency of PWM dimming. The first column is the approximate PWM %, the second is the current I measured, the third is the lumen output I measured, the fourth is the lumen output that would be expected if the PWM dimming resulted in a linear drop in output (ratio of the currents), and the fifth is the approximate output one would expect if the current were constant, estimated using the curve for the XPL HI here. I'm just using this curve to estimate relative changes in output; in other words, it is as if I scaled the curve by a factor close to 0.8, which, for instance, could be close to the efficiency of the reflector/lens system.

PWM     I meas.   lum meas.   linear     const. curr.
100%    5.34A     1315 lum    ---        ---
30%     1.37A     397 lum     337 lum    458 lum
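(For reference, the "linear" column is just the 100% output scaled by the ratio of the measured currents: 1315 lum × 1.37/5.34 ≈ 337 lum.)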

I recharged the 30Q cell and switched mode groups (using guppy rev. 2).

PWM     I meas.   lum meas.   linear     const. curr.
100%    5.53A     1384 lum    ---        ---
70%     3.54A     932 lum     886 lum    1040 lum
50%     2.48A     685 lum     621 lum    803 lum

So the actual output from PWM dimming is greater than if the dimming resulted in a completely linear drop, but not as high as if the dimming was constant current; pretty much what I would expect.

Now to approximately compare the PWM efficiency to a linear regulator efficiency. To output 685 lumens with PWM dimming requires (2.48A)(3.35V)=8.31W. For the linear regulator, outputting 685 lumens only takes 2.1A since it is constant current, so (2.1A)(3.27V)=6.87W in the LED, plus (3.80V-3.27V)(2.1A)=1.11W burned off in linear regulator leads to 7.98W total. (this voltage difference is between the forward voltage at 5.5A and 2.1A. The voltage difference would in reality be even greater taking into account the voltage sag of the battery, which would result in approximately an extra 0.1V burned off in the regulator)

So in this case, the efficiency advantage that the linear regulator has as a result of using constant current is mostly negated by the power dissipated in the regulator itself.

Edit: well, I (and possibly my DMM) have been confused by measuring voltage and currents with PWM. I think the power estimate above for the PWM dimming is not right. If relevant measurements with PWM have been done, I would be interested to see them.

I just read through these tests. I didn’t see any measurements of LED output for PWM dimming. In HKJ’s test the LD2 is quoted as 64% more efficient than PWM dimming for the medium mode of 1.2A, but this was assuming PWM dimming resulted in linear reduction of output, which my measurements show it does not. Also I think the 64% number did not include the power dissipated in the driver itself.

A couple of things, easyb. As you mentioned in the edit, measuring current draw with PWM might be tricky. One way around this might be to put an input capacitor on the battery. To be clear, this doesn't smooth out the PWM on the LED side; it just smooths out the load on the battery to make it more measurable. However, now we get into questions of PWM and batteries too. The best test would probably have to be a full-system test, which means measuring real-world battery level after, say, 20 minutes at 2A for each, and/or things like how long each can run before dropping out of regulation - but you'd need our constant-current PWM driver first.

Second,

I think you got that backwards. The battery sag is going to exist anyway, but it means LESS voltage has to be shed by the linear regulator, so less of the loss is attributable to the regulator itself compared to what you calculated. There's a second-order correction for the extra work the battery does and the extra sag, but the basic sign of this effect puts LESS blame on the linear regulator. PWM also has battery sag.

However, now comes the question: is the battery less efficient at 50% duty factor and 6A than at 100% duty factor and 3A? Is there enough capacitance that it just doesn't notice the difference, or do we get higher internal-resistance losses with PWM?
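(At least on paper, ignoring any smoothing, the plain I²R arithmetic says pulsed drive loses more in the cell: with, say, 0.03 Ohms of internal resistance, 6A at 50% duty dissipates on average 0.5 × 6² × 0.03 ≈ 0.54W, versus 3² × 0.03 ≈ 0.27W at a constant 3A - roughly double the resistive loss for the same average current.)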

Yeah, it’s called a buck regulator. More common for stepping down 2+ cells to drive an LED, but it’s easy enough to use the same design for even 1 cell.

Current is monitored by a sense resistor (low-Ω), and that's usually constant. E.g., a precision 1mΩ resistor would drop 1mV/A, so a desired 3.3A would develop 3.3mV across it.

Mostly, this is linear, not digital. There are ways to use fast digital ICs to keep up, like digitally-controlled lab supplies.

So yeah, it’s absolutely possible, and common enough for 1-mode regulated (non-DD) FET drivers. Just goggle “buck driver” and see what turns up, and check out the minimum voltage. A smart BD will PWM at higher batt voltages until it’s at 100%, then will just stay on (like regular DD) with gradually decreasing current as the battery depletes.

I've never been exactly sure how specific the term "buck regulator" is. You mean the kind with an inductor? Those have a minimum dropout voltage, right? This setup doesn't. This is more like a charge-pumped "buck regulator", except without the capacitor and its associated charging losses; instead that's replaced by the possible inefficiencies of pulsing the LED, as discussed, and it regulates on something other than the output voltage directly (as most buck regulators do), and on the time average rather than the instantaneous value. So the net effect is kind of similar to a switched buck regulator, I suppose, but it doesn't really work the same other than both having a switch.

As for the sense resistor, it's already there. From comments elsewhere it seems the MOSFET is probably about 10 mΩ, so why add another 1 mΩ... just sense across the MOSFET.

Like this https://www.fasttech.com/p/1114501 .

The simplest buck reg is just a switch, cap, diode, and inductor, with the controller turning the switch on/off. Vin > Vout to stay in regulation.

The cap smooths out the output voltage, so the LED sees a DC voltage with some small ripple. It’s not pulsed.

Using the resistance of the FET is a Bad Idea, as the range can be >100% variation across parts, and of course varies with applied voltage, gate drive, etc. And it’s used AS A SWITCH, so current will be discontinuous, near 0 when off, and higher than the LED current when on. You NEED a separate external sense resistor to sense LED current.

At least that’s likely the best way to get clean and efficient regulation of an LED’s current. Minimal switching losses in the FET, smoothed DC to the LED, only enough voltage applied to drive the LED at the desired current.

So in other words, it works nothing like this. I can't imagine that thing having internal losses that aren't significantly higher than the MOSFET's, or being able to pass the same voltage through (which affects how long it can actually regulate). Yes, your cap smooths the output AND it has charging losses (otherwise why wouldn't all PWM drivers use a cap?). So, as said, and as realized the whole time, the discussion shifts to the losses from pulsing an LED instead of the losses and issues of a true regulator of one kind or another. It's just a different scenario.

I get your point about part variation in the FET. As for needing to measure when it's on... well, that's true of any true PWM design; you have to measure voltage during the pulse. You're stuck on the idea of a true regulator design with a buffered output, which this just isn't. They aren't the same. I think your point is maybe that a regulator is just as good. At the very least it's different. It probably won't stay in regulation as long.

What I meant was that the 3.80V number I quoted was the wrong number to use. A close to full battery at 4.15V, with 0.03 Ohms IR, would sag to only 4.09V under 2.1A draw, which means the power dissipated by the linear regulator would be (4.09V-3.27V)(2.1A)=1.7W.

Hm? The switch is a FET.

Caps with low ESR have few losses. The FET is “saturated” or off, again, few losses.

The reason it’s not usually done that way is twofold:

— the cap increases costs marginally (pennies) vs just throwing pulsed DC at the LED, and

— LEDs exhibit color-shift with increasing current (PWM keeps the same color at the expense of efficiency).

If I understand your proposed design correctly, you'd have a DD FET not being kept on 100% like a real DD driver, but PWMed to regulate current. That's fine. That's essentially the same design as a 2.8A 105D minus the 7135s. The main difference being, rather than limiting the pulsed current to 2.8A, you're slamming the LED with all available DD current (dependent on battery type, state of charge, thickness of wires and springs, everything), and PWMing it to get ratiometric modulation.

Ie, instead of having 10% of 2.8A for roughly 280mA to the LED, you’re chopping the 5.9A, 7.1A, 8.5A, whatever, that the battery and thick leads and whatever can supply, to 10% of that maximum-on value. And even that PWMed current would drop as the battery depletes.

Yeah, there are multi-mode DD drivers at FT which do that, too.

Now, if you want to sense that max-on DD current to regulate the averaged current to the LED to known values, have at it. You'll still need an external sense resistor and won't be able to use the Rds(on) of the FET. Don't believe me? Go 'head, try it.

The standard buck-regulator design (with cap) will be a cleaner and easier design by the time you’re done.

Like I said, don’t believe me, but have at it yourself and find out.

@Easyb, ok, misunderstood.

@lightbringer

There's a big fat wiki around somewhere, here it is:

http://flashlightwiki.com/Driver

It says buck regulators are not suitable for single cells, and I think that's mostly right. If not, we'd better get someone to change that wiki.

Here's one I found with actual data:

https://www.fasttech.com/products/0/10001753/1127409-3v-8-4v-5-mode-0-7a-led-flashlight-driver-circuit

You can clearly see from the output current data that it's already gone out of regulation by 4.0V, probably higher. For a single-cell light, that's not really controlling anything. At higher voltages, where it's actually regulating, it's also impressively inefficient - probably more so than running the LED at 6A pulses, even assuming 6A efficiencies.

There are two things in common use, linear and PWM (and linear with PWM). I didn't cause that, Lightbringer; that's what a bunch of people have already worked on and built, and what people buy. Buck is relatively rare, as far as I can tell, for single-cell setups. I'm not proposing chopping 6A signals - that's already done, often (yeah, there are, as you say). My idea (and Easyb's first, apparently) is simply to control that process (slow control, not fast regulation) to maintain steady current as the battery drains.

Regarding MOSFETs, I don't have experience measuring their resistance, but a little searching seems to indicate this is a very well-specced property, at least for any particular FET and gate voltage. I can't understand how in the world it's harder to sense voltage across that than across some much smaller resistor you'd put in series with it. Why? It still needs a board mod, although one that could basically be done by hand to an existing board initially.

But I like option 1 because it's software-only. OK, you need different software for different LEDs. Big deal. Probably every XM-L is close enough. Make one for XP-Gs too and you've already made 95% of people happy.

Anyway, I'll believe linear might still be better.

I think something about this all has gotten lost in the fuss.

The real issue here is as much the low modes as the high ones. People use PWM FET drivers for hotrod lights, but the reason there's a driver at all, and not just two wires, is that they still want to be able to turn the light on at, say, 0.5A sometimes. What this mod does is just make sure that 0.5A is still, say, 100 lumens when the battery is down to 3.7V, as it was at 4.2V. So when you're not hotrodding, you always get a predictable brightness. It's not all about efficiency. Of course, yes, these triple-channel boards can also do that part of things, regardless of which is more efficient. On the other hand, you can do this method with existing, very simple drivers that you can actually purchase now (using option 1), plus a bit of software (but it doesn't write itself, and no, I probably won't be doing it - it's just an idea).

Given how Mountain Electronics will program almost any firmware under the sun per order, I can't see how they would be intimidated by having a version of a firmware like this for a few different LEDs, thus allowing option 1 simply by emailing them the software, no? In that case there is NO hardware change, and the only thing this does is fix one of the most commonly stated negative aspects of one of the more popular types of drivers. It just doesn't seem like a big deal. Maybe the new Texas triples are better; I can believe that.

Change it. You will never see a difference between a FET and a good buck with empty cells (when battery voltage is lower than the LED Vf at the listed current). The voltage drop will only differ by a few mV.