
Twice as much runtime excluding the “higher” / 255 mode, I assume? If the ZL gets twice the runtime in that mode too, then the problem is certainly not PWM. If there’s a link to comparable or related ZL data, that would be useful to me. I’m not well versed in the ZL products. (read: I know nothing about them)

Right, sorry. I was looking at the spec page for the ZL SC62d:
http://www.zebralight.com/SC62d-High-CRI-Daylight-tint-18650-Flashlight_p_135.html

It was the most similar model I could find. Of course, it also doesn’t have the problem mine has with losing ~20% due to a too-high reflector and another 5% due to diffuser film and probably another 5% due to lower-quality optics. At this power level, I should be getting 450+ lm OTF instead of 340. (I have another 219B with better optics getting 400 lm at 1.5A, so this 1.9A light should get at least 450 lm)

The ZL gets almost twice as much runtime on its highest mode though… even accounting for mine being like 500 emitter lumens and ZL’s being like 350, it’s still more efficient. I think they might do something clever with a boost/buck driver to convert extra volts into more light instead of just letting it burn off as heat.
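As a rough back-of-envelope of what that buck idea would buy, here is a tiny sketch of the arithmetic. All the numbers are mine and assumed for illustration (roughly 3.2 V Vf, 3.7 V average cell voltage, 1 A drive, 90% converter efficiency), not anything measured:

```c
/* Rough illustration only -- assumed numbers, not measured values.
 * Linear/FET driver: LED power = I * Vf, cell power = I * Vbat,
 * so efficiency ~ Vf / Vbat and the leftover voltage becomes heat.
 * Buck driver: cell power ~ LED power / converter efficiency. */
#include <stdio.h>

int main(void) {
    double vf = 3.2, vbat = 3.7, i_led = 1.0;  /* assumed values */
    double p_led = i_led * vf;                 /* ~3.2 W at the LED */

    double p_cell_linear = i_led * vbat;       /* linear: cell supplies the same current */
    double p_cell_buck   = p_led / 0.90;       /* buck: ~90% conversion assumed */

    printf("linear eff ~ %.0f%%, buck eff ~ 90%%\n", 100.0 * p_led / p_cell_linear);
    printf("cell draw: linear %.2f W vs buck %.2f W\n", p_cell_linear, p_cell_buck);
    return 0;
}
```

With a single cell the gap isn’t huge, which fits the later posts suggesting the emitter choice matters at least as much as the driver.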

That is absolutely insane!

Hearing all this talk of triple XP-G2s creating too much heat, well, that’s nothing compared to this!

While using a set-up like this in standard temperature and conditions (25 degrees C), how long could it stay on high before heating up to unbearable temperatures? 15 seconds? By the looks of HKJ’s discharge graph, the 20R would only be able to sustain that current for 2 minutes at best, down to 3.6 V.

My main goal right now is to have the brightest, smallest light one can possibly come up with, even at the expense of practicality and real world usability, and it looks like this is it. I may possibly be ordering parts for a build like this in the coming days :slight_smile:

Basic maths you say?

- 2,000 is 1/3 of 6,000.
- But 15 mins is 1/4 of 60 mins.

Either something not very basic is in those efficiencies you mention, or you meant 20 mins? Or I missed the tongue in cheek…

I’d also like to know how the low modes on the BLF17DD are determined. For example, the lowest mode is 2. Is that 2% of whatever the max amp draw is? For instance, if the driver is giving 11A on turbo, then the lowest mode would be 220 mA? Or are the low modes pre-determined by a set value?

Thanks for the link.

Can they really attain 3 hours at full current? That seems like it would be ~13.44 Wh at the LED, based on info from djozz. Allowing for 90% efficiency, you get 14.93 Wh. That’s more than any 18650 holds, I think.
Oops. When I wrote the above I thought the flashlight used a Nichia 219B. That could be a major hiccup in this comparison. I don’t see an 85 CRI part at a glance (probably looking at the wrong datasheet), but I do see an 80 CRI part that can supposedly do 300 lm at 1.0 A with a 2.86 V Vf. I suspect that using a Luxeon T has as much to do with their runtime figures as a fancy buck/boost driver might. FWIW, if Vf is as low as Philips says, then ZL would use a straight-up buck driver!

Looking again, I do not think ZL is correct about using a Luxeon T. More likely they use a Luxeon TX, # L1T2-5085000000000, since that matches the spec they give. Vf is still low, and they’re still running at close to 1 A (a little over, it seems), so an efficient buck circuit should give them the runtime they claim?
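A quick sanity check of that, using the figures above (1.0 A, 2.86 V Vf, 3 hours claimed); the cell capacity and the 90% converter efficiency are my assumptions:

```c
/* Back-of-envelope check of the claimed runtime -- cell capacity and
 * converter efficiency are assumed, the rest comes from the figures above. */
#include <stdio.h>

int main(void) {
    double i_led = 1.0, vf = 2.86, hours = 3.0;  /* figures from the post above */
    double wh_led   = i_led * vf * hours;        /* ~8.6 Wh delivered to the LED */
    double wh_cell  = wh_led / 0.90;             /* ~9.5 Wh from the cell at 90% buck eff. */
    double wh_18650 = 3.4 * 3.6;                 /* ~12.2 Wh for a 3400 mAh cell (assumed) */

    printf("LED: %.1f Wh, cell: %.1f Wh, available: %.1f Wh\n",
           wh_led, wh_cell, wh_18650);
    return 0;
}
```

So with the lower Vf of the Luxeon TX, the claimed runtime fits comfortably inside one 18650.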

Shortly after initial turn-on with your light, you should be looking at 80%+ efficiency. Efficiency gets higher as Vbat drops with the linear driver, so something like 90% before an NCR18650B is half empty. Since this comparison is a long way from apples-to-apples, it’s hard to say how they compare. I would expect the linear setup to do well.
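For the linear case, that efficiency is roughly Vf divided by cell voltage, so it climbs as the cell drains. A minimal sketch, with the Vf value assumed for illustration:

```c
/* Linear (FET/7135-style) driver efficiency ~ Vf / Vbat.
 * The Vf value is assumed; it shifts a bit with current and temperature. */
#include <stdio.h>

int main(void) {
    double vf = 3.3;                              /* assumed LED forward voltage */
    double vbat[] = { 4.2, 4.0, 3.8, 3.7, 3.6 };  /* cell voltage as it discharges */
    for (int i = 0; i < 5; i++)
        printf("Vbat %.1f V -> efficiency ~ %.0f%%\n", vbat[i], 100.0 * vf / vbat[i]);
    return 0;
}
```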

Lol, wondered when someone would catch that!

It’s been a rough week, nothing is basic anymore. My on-the-fly math is seriously discombobulated.

But you DO get the picture. :wink:

If I understand correctly, the BLF17DD uses PWM to control its output. So, the 2% should be a 2% duty cycle at full power, meaning it would actually be 2% of the maximum lumens, assuming perfect heat sinking.

We don’t have perfect heat sinking though, especially at 11A, so in reality the 2% value is 2% of something the light won’t actually be able to do, because the 100% mode won’t stay at 100% for more than a split second before it starts to droop. The modes should have linear performance though, until it gets hot enough for thermal sag or battery sag to kick in.

In any case, if the maximum mode is 3500 lumens at start (even just for a split second), the 2% mode should be 70 lumens. A 25% mode would then be 875 lumens. And the 100% mode would start at 3500 lumens then drop like a rock. In a 1x18650 tube light, even 25% of 11A might have pretty noticeable thermal sag. A maximum of 11A is a lot more appropriate in a big host with lots of heat sinking.
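Put as arithmetic, the level out of 255 is just a fraction of whatever the light can do at that instant. A tiny sketch using ToyKeeper’s hypothetical 3500 lm starting point (not a measurement):

```c
/* PWM level -> duty cycle -> expected output, assuming roughly linear scaling.
 * 3500 lm is the hypothetical instantaneous maximum from the post above. */
#include <stdio.h>

int main(void) {
    double max_lumens = 3500.0;     /* instantaneous max, before any sag */
    int levels[] = { 5, 64, 255 };  /* ~2%, ~25%, and 100% of 255 */
    for (int i = 0; i < 3; i++) {
        double duty = levels[i] / 255.0;
        printf("level %3d -> duty %4.1f%% -> ~%4.0f lm\n",
               levels[i], 100.0 * duty, duty * max_lumens);
    }
    return 0;
}
```

That lands right on the ~70 / ~875 / 3500 figures above.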

If it were a true current-controlled driver like led4power has been making, the performance would not be linear. 2% power would probably produce more like 5% brightness, according to the output curves measured by match and djozz. With true current control, overall efficiency is generally highest on the medium-to-low modes, and the high/turbo modes and moon modes get less and less efficient the closer they get to maximum or minimum power. This effect is also somewhat true with PWM-based lights, but it’s a lot closer to linear than a current-controlled light.
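A crude illustration of why that happens: at low current the LED gives more lumens per amp than it does near maximum, so a small fraction of the current yields a larger fraction of the light. The efficacy numbers below are made up purely for illustration; the real curves are in match’s and djozz’s test data:

```c
/* Why current control isn't linear in brightness.
 * All efficacy and current numbers here are invented for illustration. */
#include <stdio.h>

int main(void) {
    double i_max = 5.0;              /* hypothetical maximum drive current, A */
    double lm_per_amp_low  = 500.0;  /* hypothetical efficacy at very low current */
    double lm_per_amp_high = 200.0;  /* hypothetical efficacy near max (droop + heat) */

    double lm_max  = i_max * lm_per_amp_high;          /* output at 100% current */
    double lm_2pct = (0.02 * i_max) * lm_per_amp_low;  /* output at 2% current */

    printf("2%% of max current -> ~%.0f%% of max brightness\n",
           100.0 * lm_2pct / lm_max);
    return 0;
}
```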

That’s a solid explanation. If you aren’t already familiar with the concepts you may have to take a step back and read it twice, but I think you won’t get a better explanation than that.

EDIT: I guess the only thing not explicitly covered in ToyKeeper’s explanation is that the DD drivers are not regulated, and what effect that has on the lower modes. If a fresh cell gives you something akin to 220 mA in the 2% mode (all made-up numbers), a half-discharged cell will give you less in the 2% mode. The driver is literally doing its best to connect the battery directly to the LED 2% of the time; it doesn’t care what results are produced by doing this. So as the battery approaches empty, the lowest mode will be considerably dimmer than when the battery was fresh.
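A crude model of that behaviour (every number below is assumed, and it ignores that Vf shifts with current and temperature): the peak current is whatever voltage is left over after Vf, divided by the total resistance of the cell, springs, FET and wiring, and the PWM duty just scales the average.

```c
/* Crude direct-drive model -- every value here is assumed for illustration. */
#include <stdio.h>

int main(void) {
    double vf = 3.1;          /* assumed LED forward voltage around these currents */
    double r_path = 0.12;     /* assumed total resistance: cell + springs + FET + wires */
    double duty = 5.0 / 255;  /* the "2%" lowest mode */
    double vbat[] = { 4.2, 3.9, 3.6 };

    for (int i = 0; i < 3; i++) {
        double i_peak = (vbat[i] - vf) / r_path;  /* current during the PWM "on" time */
        printf("Vbat %.1f V -> peak %.1f A, average ~%.0f mA in the low mode\n",
               vbat[i], i_peak, 1000.0 * duty * i_peak);
    }
    return 0;
}
```

Even with made-up values, the lowest mode drops to less than half its fresh-cell output as the cell drains, which is the point of the edit above.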

I mean, I understand how the PWM works, but we don’t set the driver to work in percentages. We set it by 0-255. Percentages are something we RELATE to the result.

Edit: The setting of 2 in the UI will always produce the level of 2 in the 0-255 scope of things. The percentage will be in constant fluctuation as the cell drops and output falls accordingly. So thinking of it in terms of percentages is misleading, as the percentage (except 255, or 100%) will be on a constant curve. Even that 100% will be dropping, but 255 will always be the maximum setting, whatever is available. This is why it’s difficult to nail down a set number. There isn’t one.

Keep that train of thought going. It looks like you almost got there, but not quite. I think your post ends up more misleading than enlightening.

If 255 is always the maximum, then 50% of that is always 50% of that. The same thing applies to 2% (~5/255). We can call it a percentage or a fraction, but they mean the same thing: a fraction of the available whole, where the available whole is whatever the battery can currently deliver.

And the engineer went on walkabout.

I see where you’re coming from, though. This constant variable is one of the reasons the whole “lumens” race is so out of place. The standard is to measure lumens at 30 seconds, and for large cells that is fairly reasonable. There is a climb when the light first comes on, then it finds stability for just a moment; the lightbox shows some funny things. Constant variation. Voltage sag, thermal sag, Vf rise, driver inefficiency, all kinds of things affect it, and it’s different in every light, from cell to cell.

But yes, I see the thought process, even if it’s flawed. 255 is 100%, but 127.5 is not necessarily 50%. It’s not a linear curve, and it’s constantly variable due to all those issues. Say you want to set 3 modes, with Level 3 being 100% and each level below that a halving of intensity. It might stand to reason that you could set it up as 255, 127.5, 63.75 (assuming it were actually possible to use fractional settings). By the numbers that looks like an equal halving of power, or a doubling as you go up from off towards 100%. But it doesn’t work that way. Try it and see. The lightbox will show you that it’s not a linear scale. You might set the UI to 25% of the whole, but the actual output will not follow that logic. Inefficiency of the PWM cycle, the driver, whatever the reason.

It COULD happen that 127 might, at some test point, give you 50% of the full output, but it would be a point on a curve, not reliably duplicable. And this is why book sense is not always, or even often, good common sense. The devil is in the details.

And it just makes it all the more confusing for people like me who don’t have a background in electronics, no base understanding, as it were, of why any of this works.

50% duty cycle. Not 50% output.

Is duty cycle “on” 50% of the time? It’s all so misleading and confusing. lol

So this new current-regulated LD1 driver is much, much better because it’s not using PWM, but instead actually using a lower current to reduce output, right? So the emitter is seeing just the lower current, not a duty cycle of full power, and is thereby much more efficient at lower levels. Right?

Which is, of course, a whole nother can of worms.

So why would medium, at a PWM level of 39 out of 255, have given ToyKeeper the best efficiency?

That’s all correct.

Good question. There are probably a lot of factors. There may be something inductive happening, smoothing drive current a little. I really don’t know the answer though.

I guess this all wouldn’t be so much fun if it were easier, huh? :stuck_out_tongue:

A sword with two edges you say? :-p

Someone sharpened the handle on mine!

So on-time, duty cycle, and actual photons being emitted are closely related, but they do vary.

On-time is normally measured from the time current begins to flow to the time the current stops flowing. That is to say, whether light is being emitted or not, if current flows, the circuit is “on”. With PWM there is rise time and fall time, and during part of that time the current is less than what’s required to emit photons, but energy is still being used.

Duty cycle relates the total energy being consumed to some fixed “maximum”, normally high mode or direct drive. Most of our lights never truly run direct drive with PWM; for all intents and purposes here, whatever current you draw in high mode is considered 100% duty cycle for that particular torch. When the light puts out fewer lumens, the current draw is less, and that can be used to calculate total runtime. This is easy with constant modes but a lot more challenging with blinkie modes. You need a way to calculate the average current draw to do the calculations, or just run the light until the torch goes into foldback (the driver protection kicks in).
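As a small worked example of that runtime calculation, with the cell capacity and the full-duty current both assumed:

```c
/* Runtime estimate from average current draw -- capacity and currents assumed. */
#include <stdio.h>

int main(void) {
    double capacity_mah = 3000.0;  /* assumed cell capacity */
    double i_full_a = 5.0;         /* assumed current in the "100% duty" mode */
    double duty = 0.25;            /* a 25% duty-cycle mode */

    double i_avg_a = duty * i_full_a;  /* average current over many PWM cycles */
    double runtime_h = capacity_mah / 1000.0 / i_avg_a;
    printf("average %.2f A -> roughly %.1f hours (ignoring sag and cutoff)\n",
           i_avg_a, runtime_h);
    return 0;
}
```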

When does the emitter actually emit photons? The diode part of the LED is active when a voltage is presented to it and current flows. But you can have only a few milliamps flowing, generating a little warmth but no detectable light. This is worse in blinkie modes, as parts of the rise and fall time will be in that range. If you have a moonlight glow, that means you are dipping into the non-emitting range with each pulse of the PWM.

ToyKeeper’s numbers above are interesting. Exactly what I would expect: a lot of “dark” current in the low modes. The most efficient modes are near the LED’s intended range; by intended range here I mean heat saturation. If the pill were supercooled, the output lumens wouldn’t sag as much as the drive level was increased. You can also back-calculate the performance of the heatsink from good lumen curves like this.

The analogy is much like a stove. It can burn a lot of energy and never “look” hot, but you know you can pump more power into the coil and it will glow red. LEDs are not much different: they generate heat… and photons as a byproduct.