Exactly: with a fixed voltage, the current will be all over the place depending on temperature, and even a very minor change in voltage leads to a big change in current (and power consumption).
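To put rough numbers on it, here's a quick Python sketch using a simple exponential diode model. The saturation current and slope voltage are made-up illustrative values chosen so Vf ≈ 3.0 V lands near 1 A, not datasheet numbers:

```
import math

# Simple exponential diode model for a white power LED:
# I = I_S * exp(Vf / N_VT). Both constants are assumptions,
# not from any datasheet.
I_S = 1e-10      # saturation current (A), assumed
N_VT = 0.130     # effective slope voltage n*Vt (V), assumed

def led_current(vf):
    return I_S * math.exp(vf / N_VT)

for vf in (2.95, 3.00, 3.05):
    print(f"Vf = {vf:.2f} V -> I = {led_current(vf):.2f} A")
```

With those numbers, a mere 50 mV swing moves the current from about 0.7 A to about 1.5 A, which is why a fixed-voltage supply with nothing else in the loop is so twitchy.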
All the LED fatalities I've had were fast (seconds). If the LEDs are withstanding the DD, it means the heat sink is good enough and the current is OK. And if you kill one in a year or two, you'll be very happy to have a reason to upgrade to a newer model.
Doesn’t it also depend on the battery (or batteries) being used? I mean, for high currents that suck the guts out of the battery, it can only deliver so much…?
Direct drive is not just a fixed voltage and an LED; there is also some resistance in the circuit.
That resistance comes from the cell’s internal resistance (Ri), the wiring, switch, connections, and springs, and also from inside the LED itself. Some people add a bit of extra resistance with a resistor.
This resistance limits the current and will often let the LED survive direct drive.
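If you want to see how much that resistance matters, here's a sketch that solves the load line V_batt = Vf(I) + I·R, using the same sort of made-up exponential diode model as the earlier post. All values are illustrative assumptions:

```
import math

# Load-line solve: V_BATT = Vf(I) + I * r_series.
I_S, N_VT = 1e-10, 0.130   # assumed diode model parameters
V_BATT = 4.2               # freshly charged Li-ion cell (V)

def direct_drive_current(r_series):
    """Bisect on LED current I so that V_BATT = Vf(I) + I * r_series,
    where Vf(I) = N_VT * ln(I / I_S) inverts the diode model."""
    lo, hi = 1e-6, 50.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if N_VT * math.log(mid / I_S) + mid * r_series > V_BATT:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Total series resistance: cell Ri + spring + switch + wiring.
for r in (0.05, 0.10, 0.20):
    print(f"R = {r:.2f} ohm -> I = {direct_drive_current(r):.1f} A")
```

With these numbers, doubling the series resistance roughly halves the direct-drive current, which is why a fraction of an ohm in springs and switches is often what saves the LED.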
It depends on how much the current changes as the Vf changes. I fairly often use DC-DC converters that only have an adjustment for a fixed output voltage with LEDs ("10 volt" LEDs on a 12-14 V source), and if the LED will survive the current when it is hottest, when the Vf is lowest and the current is highest, it works fine. The initial setting just has to be low enough that the current doesn't climb too high when the LED heats up.
That's the same set of factors at work in a direct drive setup: the LED just needs to be able to withstand whatever the current rises to as the heat builds up. The input voltage is limited by the choice of cell instead of by a voltage-adjust pot, and keeping the LED cooler (direct-copper MCPCBs) limits how far the Vf will drop.
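Here's roughly how I'd sanity-check the cold setting, again with an exponential model. The slope voltage, tempco, and temperature rise are all assumptions you'd want to adjust for your own LED:

```
import math

# Sanity check for the "set it low enough cold" approach with a
# fixed-voltage DC-DC converter. All numbers are assumptions,
# not datasheet values.
N_VT = 0.39        # effective slope voltage for 3 dies in series (V)
TEMPCO = -0.006    # total Vf tempco, ~ -2 mV/K per die * 3 dies (V/K)
DELTA_T = 40.0     # expected junction temperature rise (K)
I_MAX = 3.0        # current the LED must still survive when hot (A)

# At a fixed output voltage, Vf dropping by |TEMPCO| * DELTA_T multiplies
# the current by exp(|TEMPCO| * DELTA_T / N_VT) in this exponential model.
hot_over_cold = math.exp(abs(TEMPCO) * DELTA_T / N_VT)

print(f"current rises ~{hot_over_cold:.2f}x from cold to hot")
print(f"so set the converter for a cold current of at most "
      f"{I_MAX / hot_over_cold:.2f} A")
```

With those assumed values the current nearly doubles from cold to hot, so the cold setting has to leave that much headroom.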
Systems with negative feedback tend to be more stable than ones with positive feedback. Most conductors have higher resistance as they heat up, so with a constant voltage source the current decreases; that’s negative feedback. An LED, on the other hand, draws more current as it heats up, which is an example of positive feedback. Left unchecked, and with a supply that can deliver enough amps, this can lead to thermal runaway, the classic result of positive feedback.
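You can watch that feedback loop play out in a toy simulation. This sketch iterates current → power → temperature → lower Vf → more current; the diode model, tempco, and thermal resistance are purely illustrative assumptions:

```
import math

# Toy simulation of the positive feedback loop: more current -> more
# heat -> lower Vf -> more current. All constants are illustrative
# assumptions, not measurements.
I_S, N_VT = 1e-10, 0.130   # exponential diode model at room temp
TEMPCO = -0.002            # Vf shift per kelvin (V/K), assumed
R_TH = 6.0                 # junction-to-ambient thermal resistance (K/W)

def settle(v_supply, r_series, steps=200):
    """Iterate to an electro-thermal steady state. Returns (I, dT),
    or None if the loop blows past a sanity limit (thermal runaway)."""
    d_t = 0.0
    for _ in range(steps):
        # Solve v_supply = Vf(I) + TEMPCO*d_t + I*r_series by bisection.
        lo, hi = 1e-6, 100.0
        for _ in range(100):
            mid = (lo + hi) / 2
            vf = N_VT * math.log(mid / I_S) + TEMPCO * d_t
            if vf + mid * r_series > v_supply:
                hi = mid
            else:
                lo = mid
        i = (lo + hi) / 2
        vf = N_VT * math.log(i / I_S) + TEMPCO * d_t
        d_t = R_TH * i * vf          # new temperature rise from LED power
        if d_t > 150:                # implausible junction temp: runaway
            return None
    return i, d_t

print(settle(3.3, 0.15))   # some series resistance: settles (~2 A)
print(settle(3.3, 0.0))    # stiff supply, no resistance: runaway (None)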
I used the circuit ComfyChair refers to and supplied a constant voltage to a project of mine. Even though the current increased as the LEDs heated up, I was OK with it because the circuit had a built-in current limit of 2.8 A, so in effect I used it as a constant-current source. It was a nice and simple way to go, and cheap!
I have some very last-century LED drop-ins, in big old M@g D-cell lights, that have four Luxeons in direct drive.
Dinosaur-grade lights.
White LEDs are OK with either alkaline or NiMH.
Amber LEDs used with NiMH overheat, and the light gets very dim very quickly.
The NiMH cells don’t “sag” their voltage under load, and so they overdrive and overheat the amber LEDs.
The alkalines sag their effective voltage quickly under the same load.
Batteries are not a regulated voltage source. As you draw more current from them (i.e., as the LED gets hotter), their voltage decreases, which causes the LED current to decrease. That helps (somewhat) to mitigate thermal runaway in direct drive lights, particularly with lower-quality batteries and single-cell lights.
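A quick comparison makes the point. Using the same illustrative exponential model as the sketches above, and treating heating as a flat 0.1 V drop in Vf (an assumption, not a measurement), compare a perfectly stiff 3.2 V source against one with 0.3 ohm of internal resistance:

```
import math

# How much cell internal resistance damps the thermal current rise.
# Same illustrative exponential diode model as above; heating is
# modeled as a flat 0.1 V drop in Vf (an assumption).
I_S, N_VT = 1e-10, 0.130
V_OC = 3.2                      # open-circuit source voltage (V), assumed

def current(r_i, vf_shift):
    """Bisect on I so that V_OC = Vf(I) - vf_shift + I * r_i."""
    lo, hi = 1e-6, 100.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if N_VT * math.log(mid / I_S) - vf_shift + mid * r_i > V_OC:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

for r_i in (0.0, 0.3):          # stiff regulated source vs a tired cell
    cold, hot = current(r_i, 0.0), current(r_i, 0.1)
    print(f"Ri = {r_i:.1f} ohm: {cold:.2f} A cold -> {hot:.2f} A hot "
          f"({hot / cold:.2f}x rise)")
```

With those made-up numbers, the stiff source lets the current more than double as the LED heats, while the saggy cell holds the rise to about a quarter. That's the accidental protection you get from lower-quality batteries.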