I am new here, so please forgive my ignorance. Recently I started looking for a good flashlight, got curious about their internals, and became very puzzled by the drivers offered for DIYers. Linear drivers based on as many as 8 (!!!) 7135 chips seem to be highly regarded. Sure, running at a constant (regulated) current is a good feature, as it helps overall efficiency (LEDs are more efficient at lower currents). However, changing current by switching between groups of 7135s seems kludgy, and using 8 components in parallel is usually a sign of using the wrong component :). Furthermore, linear drivers are fundamentally inefficient, as they simply convert the excess voltage (times the current) to heat.
So why not use a single MOSFET to replace all those 7135s? But wait, wouldn't that make a so-called direct-drive driver? Maybe not. Looking at the DD/FET drivers offered by reputable sources, I see that they are obviously deficient relative to linear ones in several respects. Firstly, they do not regulate current, and secondly, they pulse the light when operated at less than 100% duty cycle, which reduces overall efficiency and is visually questionable for some folks.
So here is what puzzles me: each of these deficiencies is easily rectified, at least on paper. An inductor in series with the LED, plus a flyback Schottky diode in parallel with the driver output and a capacitor in parallel with the LED, will convert the pulsed driver output into a much smoother voltage fed to the LED. That combines higher efficiency due to nearly constant LED current with higher efficiency due to minimal loss in the driver. If desired, one can also make the LED current regulated (as in, independent of battery voltage); that takes a Hall-sensor chip or a shunt plus amplifier. Basically, what I am describing is similar to a brushed motor controller or a buck converter.
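To put some rough numbers on what that filter does, here is a quick back-of-the-envelope sketch in Python. Every value in it (cell voltage, LED forward voltage, PWM frequency, inductance) is an assumption I picked for illustration, not a measurement of any real driver:

```
# Rough numbers for a PWM'd FET driver followed by the L + flyback-diode + C
# stage described above, feeding the LED. All values are illustrative
# assumptions, not measurements.

V_batt = 4.0     # V, a partly charged Li-ion cell (assumed)
V_led  = 3.2     # V, LED forward voltage at the target current (assumed)
f_pwm  = 20e3    # Hz, a plausible PWM frequency (assumed)
L      = 47e-6   # H, series inductor (assumed)
I_led  = 3.0     # A, target average LED current (assumed)

duty = V_led / V_batt    # in steady state, average Vout ~ D * Vbatt
t_on = duty / f_pwm      # FET on-time per PWM cycle

# During t_on the inductor sees (V_batt - V_led) across it, so the
# peak-to-peak ripple current is di = V * t / L.
ripple = (V_batt - V_led) * t_on / L

print(f"duty cycle     : {duty:.0%}")
print(f"current ripple : {ripple:.2f} A p-p "
      f"({ripple / I_led:.0%} of the {I_led:.0f} A average)")
```

With those assumed values the LED sees roughly 80% duty and about 0.7 A of peak-to-peak ripple around the 3 A average, i.e. a far cry from the full on/off pulsing of a bare DD driver.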
On the practical side, I admit I have not played with numbers to see how big an inductor would be needed. It may be too big for a small driver board. However, the beauty of this is that it does not have to be on the board at all. Circuit-wise it sits between the driver and the LED, so why not position it in that sizeable free space inside the pill? A dimensionally larger inductor would have lower ohmic loss and generate less heat…
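Edit: here is a quick sizing sketch after all, with assumed values only (worst-case ripple lands at 50% duty):

```
# How big does the inductor need to be? Solve the ripple formula for L.
# All numbers are assumed for illustration.

V_batt = 4.2             # V, full cell (assumed)
f_pwm  = 20e3            # Hz (assumed)
I_led  = 3.0             # A, average LED current (assumed)
budget = 0.3 * I_led     # allow 30% peak-to-peak ripple (assumed budget)

# di = V_batt * D * (1 - D) / (f * L), which is maximized at D = 0.5
D = 0.5
L_min = V_batt * D * (1 - D) / (f_pwm * budget)
print(f"L >= {L_min * 1e6:.0f} uH at {I_led:.0f} A")
```

Under those assumptions, something like 60 µH rated for 3 A continuous. That is indeed a physically large part, which fits the idea of tucking it into the pill rather than onto the driver board.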
More expensive lights, such as Zebralights, use boost drivers. This allows for greater efficiency and full regulation of output. It’s great! The downside is that they’re expensive, for whatever reason.
The reason you find linear drivers and FET drivers in budget lights is that they’re cheap. You’re right, they are kind of kludgy and inefficient and not very well regulated, but you get what you pay for.
Linear drivers based on parallel 7135 chips are popular because they are dead simple and they work. And they are also cheap. Worst-case efficiency is 70% or so, but it increases as Vbatt decreases.
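A quick illustration of that, assuming a ~3.0 V forward voltage (a ballpark figure for an LED at 7135-level currents):

```
# Linear (7135) efficiency is just Vled / Vbatt: everything above the LED's
# forward voltage is burned off as heat. Forward voltage is assumed.
V_led = 3.0   # V, assumed forward voltage at ~350 mA per 7135
for V_batt in (4.2, 3.9, 3.6, 3.3):
    print(f"Vbatt = {V_batt} V -> efficiency ~ {V_led / V_batt:.0%}")
```

That gives roughly 71% on a full cell, climbing past 90% as the cell sags toward the LED's forward voltage.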
Linear drivers are simple, small, and cheap; they allow for any user interface, and they always work. Fancier drivers tend to be more challenging to design, are bigger and more expensive, and, because they use coils, are less impact-resistant.
One reason those medieval linear drivers keep sticking around is that once one with a good, cool user interface is built into a flashlight, as long as you do not start measuring stuff, the light works like a dream; there is nothing wrong to notice at all!
Multiple 7135s are a cheap and working solution. You can also get linear FET drivers that use a single transistor to adjust the current over the full range; that is better, but also more expensive (because it is made by an enthusiast). Direct-drive PWM works, but has efficiency problems at low brightness and is a bit hard on the LED. Using a buck converter is not perfect either, due to the low (or non-existent) voltage headroom, and the diode drop does not help with that.
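To show the headroom problem with assumed numbers (the forward voltage is a guess, and the flyback diode's drop would eat into what little headroom is left on a non-synchronous buck):

```
# Why a plain buck is awkward in a single-cell light: the headroom between
# the cell and the LED shrinks to nothing as the cell drains.
# Illustrative values only.
V_led = 3.2   # V at the target current (assumed)
for V_batt in (4.2, 3.9, 3.6, 3.4):
    print(f"Vbatt = {V_batt} V -> headroom = {V_batt - V_led:+.1f} V")
```

On a single Li-ion cell the headroom starts around a volt and is nearly gone well before the cell is empty.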
The best would be a buck-boost chip with four integrated high-power MOSFETs, but I doubt one exists in a small and cheap package yet.
By “low voltage drop”, do you mean that a buck needs some headroom for voltage conversion? That’s true, but the circuit I was describing is not a buck converter; it only has some similarity to one. It is not at all intended to make a stable output voltage independent of load or input-voltage variation. Think of it as a regular FET/DD driver followed by an LC smoothing filter (and a diode to carry the return current). It is exactly like a brushed motor controller, except the inductor is external to the load.
So it will work fine with any voltage drop, just the same as a regular DD driver. In fact, it will work exactly like a DD driver at 100% duty cycle: the inductor, capacitor, and diode will contribute nothing then, except for a little ohmic loss in the inductor. And the voltage dropped by the whole circuit will scale with the PWM duty cycle, from almost zero up to whatever the difference is between the battery voltage and the LED’s turn-on voltage.
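In numbers (assumed cell voltage, ignoring the small ohmic and diode losses):

```
# With the filter in place, the average output is roughly D * Vbatt, so
# the "drop" across the circuit scales with duty. Values are assumed.
V_batt = 4.0   # V, assumed
for D in (1.0, 0.9, 0.8):
    print(f"D = {D:.0%} -> Vout ~ {D * V_batt:.1f} V, "
          f"drop ~ {(1 - D) * V_batt:.1f} V")
```

At 100% duty the drop goes to zero and the thing behaves exactly like a plain DD driver, as I said above.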
This also means that there is absolutely no need for the multiple MOSFETs you mention. In simple words: you already know there are inexpensive DD drivers that use a single MOSFET to handle all the current needed to drive the LED. That’s what puzzles me: it seems such a simple addition to already-existing DD drivers, and yet I do not see it offered for sale.
Care to elaborate on why you think it wouldn’t work too well? Please understand that the circuit in question is not something I invented; it is a trivial way of controlling a DC load with minimal losses. For flashlight use, wasting 1/4 to 1/3 of the power in a linear driver may be surprisingly acceptable to many, but in other applications it is not.
It should be possible to use a capacitor and diode to smooth the PWM output, though.
Like what is done in AC-DC regulators, but instead of AC it’s a PWM signal.
There’s the bit about pulsed versus smoothed: basically, if the LED is only on for 50 percent of the time, it runs cooler than being on 100 percent of the time (constant).
So, as the theory goes, you up the volts and amps, because you can now dissipate more watts, so the light is brighter, or appears brighter. Sometimes it works out.
Regarding buck/boost and “getting what you pay for”: pffff, yeah, right. There are a couple of versions of this. One just needs absolute, blind belief, where, funnily enough, the guy needs to make a living, you know; he’s got a fictitious house and kids and a big budgie that needs chemo in America for six weeks a year, preferably in peak season…
And then there’s the cheap tat whose sale a law should really disallow: you know, in the bin, buy another, in the bin, buy that one, in the bin… It’s criminal in so many ways.
Du-dahhh: there’s a fair profit, and then there’s a rogering without the lube… then again, we could keep milking these 20-year-old chips…
Flyback DC-DC converters are mostly useful for large voltage conversion ratios (e.g., 5 V up to 100+ V DC), since buck/boost converters tend to behave badly at the extreme duty cycles such ratios require.
The difference is that buck converters are designed to give a stable output voltage irrespective of input voltage or output current. The circuit I was describing does not do that. In fact, the output voltage will depend on the output current and the input voltage, because there is no feedback controlling the PWM duty cycle. You could, however, say that this circuit is an intentionally bastardized buck converter.
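To illustrate the open-loop behavior (assumed numbers, duty fixed at 80%):

```
# Open loop: the duty cycle is fixed, so the output simply tracks the cell
# voltage as it sags. A real buck would adjust D to hold Vout constant.
# All values are assumed.
D = 0.8
for V_in in (4.2, 3.7, 3.3):
    print(f"Vin = {V_in} V -> Vout ~ {D * V_in:.2f} V (D fixed at {D:.0%})")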