Can the DD FET drivers be used for higher voltage?

I was looking at driving one of the big arrays from Cree, but I obviously can’t just slap 36V through a driver rated for 3S Li-ion voltage, so I was wondering if I could use the LED+ pad on one of the smaller drivers to run the gate of an appropriately rated MOSFET, almost like the series pass transistor in a bench PSU.

The big MOSFET would switch power from a boost converter. At least in theory, I believe this should work, giving the versatility of an ATtiny-based driver to the monster flooders.

Power from the attiny driver would be supplied independently from the boost converter, so it has a nice clean power supply.

Is there an obvious problem with my thinking here? My main concern being the switching frequency of the DC-DC converter doing strange things with the driver PWM frequency. Is that likely to be a problem?

I think so… I have not tried it yet, but there must be either a voltage divider or a regulator for the MCU; it can’t take more than about 5.5VDC.

I am sure that other more knowledgeable folks will chime in.

The LM2936 voltage regulator can handle up to 40V. It could be used to power your MCU. There’s some info on it in this thread:

EDIT: In addition to a voltage regulator for the MCU, you will need to replace the capacitors and resistors with ones that can handle the voltage. I think most of the big MOSFETs we use can handle 36V, but that would need to be verified too.

Okay, I've got some thoughts about your FET drive scheme itself below, but first to address your actual question, about interactions between PWM control and your boost converter.

If your boost converter is happy to run open-circuit, you probably won't have any serious issues. This includes any constant-voltage regulator, or a constant-current, constant-voltage one operating in voltage control mode. These will have no problem, aside from the usual warnings about thermal runaway.

There's one risk with constant-current converters: the output voltage wants to go to infinity when the load goes open-circuit, charging up the output capacitor (if present) and eventually blowing up the driver. (So if your driver doesn't blow up, it already has some mechanism to limit voltage rise.) But if the cap is charged up to, say, 40 or 50V, then when the FET turns back on, the cap dumps high current through the LED for a short time, possibly enough to blow bond wires.
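To put rough numbers on that cap-dump scenario, here's a quick back-of-the-envelope sketch. All the values (a 100 µF output cap charged to 45V, an LED Vf of 36V, about 0.2Ω of total path resistance through the FET, wiring, and LED) are hypothetical, so substitute your own parts:

```python
# Rough sketch of the cap-dump inrush when the FET turns back on.
# All numbers are hypothetical -- substitute your own cap value, Vf,
# and path resistance (FET Rds(on) + wiring + LED dynamic resistance).

def inrush_peak_amps(v_cap, v_f, r_path):
    """Peak current at turn-on: the cap acts like a voltage source at
    v_cap behind r_path, clamped by the LED at roughly v_f."""
    return max(v_cap - v_f, 0.0) / r_path

def cap_excess_energy_joules(c_farads, v_cap, v_f):
    """Energy the cap dumps while discharging from v_cap down to v_f."""
    return 0.5 * c_farads * (v_cap**2 - v_f**2)

peak = inrush_peak_amps(v_cap=45.0, v_f=36.0, r_path=0.2)
energy = cap_excess_energy_joules(100e-6, 45.0, 36.0)
print(f"peak ~{peak:.0f} A, excess energy ~{energy * 1000:.1f} mJ")
```

Even these made-up numbers put the peak in the tens of amps, which is exactly why limiting the open-circuit voltage matters.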

So the voltage needs to be limited -- no problem. If you've got both voltage and current adjustments, set the open-circuit voltage just a little higher than the LED's Vf at full current -- there will be a little extra inrush current, but nothing the LED can't handle. If it's a constant-current regulator with no provision for voltage adjustment, you'd need to add some sort of zener-based voltage limit to the feedback network, or maybe change a zener that's already there. (If you can post a schematic of the boost converter you're using, we can help with this. You probably realize it, but don't just strap a 38V zener across the output -- it'll see the full current and burn up.)
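One common way to do a zener-based limit is a zener from the converter's output to its feedback pin: once the output exceeds the zener voltage plus the FB reference, the zener conducts into the FB node and the converter stops raising the output. A quick sketch of the resulting limit, with hypothetical part values (a 36V zener and a typical 1.25V FB reference -- check your converter's datasheet for the real reference voltage):

```python
# Sketch of the open-circuit clamp idea: a zener from the boost
# converter's output to its feedback pin. When V_out exceeds
# V_zener + V_fb(ref), the zener conducts into FB and the converter
# stops raising the output. Part values below are hypothetical.

def clamp_voltage(v_zener, v_fb_ref):
    """Approximate open-circuit output limit with a zener into FB."""
    return v_zener + v_fb_ref

# e.g. a 36 V zener and a typical 1.25 V feedback reference:
limit = clamp_voltage(v_zener=36.0, v_fb_ref=1.25)
print(f"open-circuit output limited to ~{limit:.2f} V")
```

Unlike a zener straight across the output, this one only carries the small FB-node current, so a little 500 mW part is fine.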

Not familiar with bench PSU circuits, so I'm not sure if you're thinking of high-side or low-side control, but since you're hooking it to LED+, I guess low-side n-channel, sourcing current from LED+ to drive the gate on. Also, I'm assuming the driver's B- is common with the boost converter's negative output.

Problem is, most LED drivers I've seen use low-side control, so the LED+ pad is more-or-less directly connected to B+; L- sinks current through a FET and/or AMC7135s. If you use one of these low-side drivers to drive your FET from L+, it'll be stuck on, because the gate will always be held at the driver's supply voltage. To drive your FET from L+, you'd need a high-side driver, so make sure the driver you plan to use works that way.

But if your driver does have low-side switching, you can instead use its L- to drive a p-channel FET on the high side of the LED -- taking care to respect the maximum VGS. Note that if L- pulls the gate all the way to ground from a 36V rail, VGS hits -36V, which is beyond the ±20V absolute maximum of most FETs, so you'd likely want a divider or gate zener to keep it in bounds. On the plus side, with that much gate drive available there's no need for a "logic-level" FET.
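If you do need to limit VGS, a simple divider works: one resistor from the FET's source (the +36V rail) to the gate, and a second from the gate down to the driver's L- sink. When L- pulls low, the gate sits at the divider point and |VGS| is the drop across the upper resistor; when L- floats, the upper resistor pulls the gate back up to the source and the FET turns off. The values here are hypothetical:

```python
# Sketch of limiting p-channel VGS with a divider: r_up from the FET's
# source (+36 V rail) to its gate, r_down from the gate to the driver's
# low-side sink (L-). With L- pulling low, the gate sits at the divider
# point, so |VGS| is the drop across r_up. Hypothetical values.

def vgs_with_divider(v_rail, r_up, r_down):
    """Gate-source voltage magnitude while the sink is active."""
    return v_rail * r_up / (r_up + r_down)

# e.g. 10k up / 20k down on a 36 V rail keeps |VGS| at 12 V,
# comfortably inside a typical +/-20 V gate rating:
print(f"|VGS| ~{vgs_with_divider(36.0, 10e3, 20e3):.1f} V")
```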

As I understand it, your strategy is to use some lower voltage (8-12v, maybe, like 2-3s Li-ion?) to power both the FET driver (which already has a zener mod or voltage regulator to accept that input) and a boost converter; the boost converter output will go to the big LED array with its own FET, and the driver's output will go around the boost converter to drive the second FET's gate.

  • Benefit: no modification to the LED driver
  • Downside: lots of variables:
    • LED driver high-side/low-side
    • boost converter common- / common+ / isolated
    • your choice of n/p-channel FET and low/high-side configuration.
    None of these are problems, you just need to make sure everything matches up.

But like ImA4Wheelr says, you can probably use the FET driver's own FET directly (not to drive another FET), as long as you upgrade the necessary components. With that plan, you'd have the boost converter first, then the modified FET driver and LED array in the usual configuration.

  • Benefit: simplicity
  • Downside: need to mod the driver with new passives, and add a voltage regulator.

Sorry guys, I just experienced a doh moment. The LED+ pad is straight to battery positive, so that’s obviously not going to be switching anything. Yes, of course these drivers are low-side regulating, but I think I worded my question incorrectly.

I meant: if I had an independent 5V supply for the driver itself, and jumped a wire from the driver to a TO-220 MOSFET’s gate to switch power from the 36V supply. I wasn’t sure if I was being stupid and missing something obviously wrong.

The main idea being that nothing has to be changed on the LED driver itself -- only one more component, and not necessarily even on the same PCB.