Review: 10W Cool White LED Floodlight from Wallbuys

Yep, I guess 15W should be OK, and that’s what the stock 3T6 driver does from 4-13V… I just looked it up: an XM-L2 T5 does 630 lumens @ 2A, so I guess at 4.5A it should make a nice floodlight with 1200 lm or so.

Nice hint with the paint, I am going to grind it away…
I will try to solder two XM-Ls on one star like I have seen here somewhere; maybe I can grind some of the protective layer away… not sure, but I will try.
I have some Noctigons/SinkPADs lying around anyway.

The screws for the mounting piece (not sure what it’s called) don’t fit the threads, or maybe the threads are painted… has anyone else had this?

You can mount two XM-Ls on one star, but grinding away the space around it will do no good. There is just aluminum under the dielectric layer.
Two XM-Ls at 4.5A will give more output than one, for sure. At 4.5A into one XM-L, you’d be better off with a direct bonded copper star.

I wonder if there would be any differences in heat output:

  • 1 x XM-L driven at 3A

or

  • 2 x XM-Ls sharing the same star, driven at 3A… so 1.5A per emitter.

I’ve noticed that most LEDs seem to run fairly cool until you drive them slightly harder.

There's no aluminum underneath to worry about with a SinkPAD/Noctigon.

That ^^ worked fine; direct drive, I think it was around 5.8 amps, and it made plenty of heat (and a LOT of light). Getting a usable beam pattern with a round flashlight reflector is near impossible, but it would work great in a floodlight.

What Comfy did should work fine. Now that I think about it, on a regular star it would have heat issues.

Thanks, but I probably didn’t explain that correctly. :stuck_out_tongue:

Which would give off more BTUs? One XM-L driven at 3A, or two XM-Ls at 1.5A each? I’m guessing the first option, but I haven’t been able to confirm this.

I would confirm your hypothesis: all the power that goes in has to come out, and we all know that harder-driven XM-Ls lose efficiency (picture the current/lumen diagrams: the curve is not linear, the slope decreases).
So if less light comes out, more heat does. I would not overrate this, though; maybe a few degrees’ difference?

Comfy, have you ripped part of the middle layer away?

Actual heat generated by the emitters would be dependent on the Vf. I’m assuming if current is the same, the pair of emitters would have a lower Vf. Therefore, the pair driven at 1.5A each would generate less heat (and more lumens) than the single emitter at 3A. The difference is relatively minor, but probably measurable.
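
To put rough numbers on it, here’s a quick Python sketch. The Vf and wall-plug efficiency figures are invented for illustration (real values depend on bin, temperature, and the datasheet curves), but the trend is the point:

    # Emitter heat = electrical power in, minus the part that leaves as light.
    def emitter_heat_watts(i_f, v_f, efficiency):
        p_in = i_f * v_f
        return p_in * (1.0 - efficiency)

    # One XM-L at 3A: higher Vf, lower efficiency (made-up values).
    single = emitter_heat_watts(3.0, 3.35, 0.25)
    # Two XM-Ls at 1.5A each: lower Vf, better efficiency (made-up values).
    pair = 2 * emitter_heat_watts(1.5, 3.05, 0.32)

    print(f"single @ 3.0A: {single:.2f} W of heat")  # ~7.5 W
    print(f"pair   @ 1.5A: {pair:.2f} W of heat")    # ~6.2 W

Less heat and more lumens from the pair, as above; the gap is real but not dramatic.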

I cut away the top dielectric layer under the outer halves of the LEDs. So there's a thicker-than-optimal layer of solder under the outer half that connects directly to the base; still better than nothing. The inner halves are sitting on the normal raised pad.

Thanks for confirming what I had in mind.

No, the power in and dissipated as heat would be the same (assuming the driver power loss is also included). Say the batteries were at 4.0V: 4V * 3A is the same as 4V * (1.5A + 1.5A).

True, if I was referring to total power in. I was only referring to emitter power dissipation. That was the specific question asked, as any driver power dissipation didn’t seem to be part of the question.
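
To show both accountings at once, here’s the same sketch extended with a hypothetical linear driver at 4.0V in (same invented Vf values as above): the total draw is identical, it just splits differently between the emitters and the driver.

    # A linear driver burns whatever headroom the emitters don't use,
    # so total input power is the same either way.
    V_BATT = 4.0  # hypothetical battery voltage

    def power_split(i_total, v_f):
        p_in = V_BATT * i_total         # total power drawn from the battery
        p_emitters = v_f * i_total      # power delivered to the LED(s)
        p_driver = p_in - p_emitters    # dropped across the linear driver
        return p_in, p_emitters, p_driver

    for label, vf in (("one XM-L at 3A", 3.35), ("two XM-Ls at 1.5A each", 3.05)):
        p_in, p_led, p_drv = power_split(3.0, vf)
        print(f"{label}: in={p_in:.2f}W  LEDs={p_led:.2f}W  driver={p_drv:.2f}W")
    # Both draw 12.00W total; only the LED/driver split changes.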

So waste heat scales 1:1 with input power even though light output doesn't?

For the emitter itself, the heat/light ratio increases as total emitter power increases. LEDs are less efficient as the drive current increases.

Assuming an external constant power supply, does anyone know if the ratio of heat output (measured at the thermal pads or MCPCB) to input power is linear with the XM-L? I couldn’t find anything to illustrate this in the Cree datasheets. I don’t care about relative flux ratios.

Hypothetically:
If the emitter outputs 2000 BTU at 2A, will it output 3000 BTU at 3A? I suspect it will output more than 3000, but it’s just a hunch from all the mods I’ve experimented with. For instance, moderately warm-running flashlights driven at 2.5A seem to suddenly run HOT at 3A. I’m fairly certain this isn’t just my perception, but I could be wrong.

Watts is watts… so the amount of power dissipated will scale with current. But heat radiates in proportion to the FOURTH power of absolute temperature, so the net radiated power goes as T_surface^4 - T_ambient^4.
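
For a feel of how strong that fourth-power term is, a quick sketch (hypothetical 100 cm^2 surface with emissivity 0.9 into 25 C surroundings; in a real lamp, conduction and convection carry most of the heat):

    # Stefan-Boltzmann: net power radiated by a surface into its surroundings.
    SIGMA = 5.67e-8             # W / (m^2 * K^4)
    AREA = 0.01                 # m^2 -- hypothetical 100 cm^2 surface
    EMISSIVITY = 0.9            # assumed
    T_AMBIENT = 25.0 + 273.15   # K

    def radiated_watts(surface_celsius):
        t = surface_celsius + 273.15
        return EMISSIVITY * SIGMA * AREA * (t**4 - T_AMBIENT**4)

    print(f"{radiated_watts(50):.1f} W at 50 C")  # ~1.5 W
    print(f"{radiated_watts(70):.1f} W at 70 C")  # ~3.0 W, doubles for +20 C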

Pyro makes a very valid point, but I do not think it addresses the question being asked.
Because Vf rises with input current, the resulting power dissipation is not linear.
A hypothetical example:
If = 1A, Vf = 2.9V, P = 2.9W
If = 2A, Vf = 3.1V, P = 6.2W
If = 3A, Vf = 3.3V, P = 9.9W
As you can see, power increased in a non-linear way.
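
Those numbers happen to fit a straight-line Vf model, which makes the curvature explicit (still hypothetical figures):

    # With Vf rising linearly in current, emitter power is quadratic, not linear.
    def emitter_power(i_f):
        v_f = 2.7 + 0.2 * i_f   # made-up model matching the table above
        return i_f * v_f        # P = 2.7*If + 0.2*If^2

    for i in (1.0, 2.0, 3.0):
        print(f"If = {i:.0f}A -> P = {emitter_power(i):.1f}W")
    # 2.9W, 6.2W, 9.9W: tripling the current gives 3.4x the power.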

Again, since the driver is in the box, you have to allow for its dissipation also. The total power you have to shed is volts in * amps in.

The driver’s in a separate box at the back, barely thermally connected to the main chassis. The driver will have its own thermal issues, because of how it’s installed (check my review and mod above to see why).

As a purely theoretical / non-10W-floodlight question: the LED converts input power into one of two things, waste heat or photons. So what I'm getting at is, once you hit the point where efficiency starts falling off from the peak, don't fewer lumens mean more of that power is turned into waste heat instead? Like if you had a graph showing all three, wouldn't the lumens and heat curves start to diverge after a certain point?
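
Something like this toy model is what I'm picturing (every number below is invented; only the droop trend is real):

    # Toy curves: input power, light power, and waste heat versus drive current.
    def toy_curves(i_f):
        v_f = 2.7 + 0.2 * i_f     # made-up Vf model from earlier in the thread
        p_in = i_f * v_f          # electrical power in
        eff = 0.35 - 0.04 * i_f   # hypothetical droop: 31% at 1A, 23% at 3A
        return p_in, p_in * eff, p_in * (1.0 - eff)

    print("If      in      light   heat")
    for i in (1.0, 2.0, 3.0):
        p_in, light, heat = toy_curves(i)
        print(f"{i:.1f}A  {p_in:5.2f}W  {light:5.2f}W  {heat:5.2f}W")
    # From 1A to 3A: input x3.4, light only x2.5, heat x3.8 -- diverging curves.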