Well, not really. They acknowledge that just crowbarring an LED across a cell can result in too-high current, pushing the LED into a low-luminous-efficiency point. There’s little dissipation in the “driver” (wire), but it’s Hell on the LED, and hardly efficient as a system.
Even PWMing it with a bang-bang FET ain’t a picnic, either, ’cause each current spike will still be Hell on the LED.
And a red LED, or worse, an IR LED, “direct-driven” from a right-off-the-charger Li-ion cell (4.2V across a diode with a Vf under 2V) will glow red all right, at least for a short time, but probably from incandescence.
Whereas a UV LED, with its relatively high Vf, would be perfectly workable.
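To put rough numbers on why Vf matters so much here, a crude piecewise-linear sketch (the Vf and series-resistance figures below are illustrative assumptions, not measurements, and real Vf rises with current, so this overstates things, but the ordering holds):

```python
# Crude piecewise-linear LED model: I = (Vcell - Vf_knee) / R_series.
# R_series lumps cell ESR, wiring, and LED slope resistance.
# All component values are assumed for illustration.

def dd_current(v_cell, vf, r_series):
    """Direct-drive current (A), piecewise-linear diode model."""
    return max(v_cell - vf, 0.0) / r_series

V_LIION_FULL = 4.2   # right-off-the-charger Li-ion
R_SERIES = 0.15      # assumed total series resistance, ohms

for name, vf in [("IR LED", 1.4), ("red LED", 1.9), ("UV LED", 3.6)]:
    amps = dd_current(V_LIION_FULL, vf, R_SERIES)
    print(f"{name} (Vf ~{vf} V): roughly {amps:.1f} A")
```

The low-Vf diodes see several volts of headroom across a fraction of an ohm, hence the incandescence; the high-Vf UV LED soaks up most of the cell voltage itself.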
So youse are both right, but each arguing that the other’s wrong.
Direct drive can be pretty efficient (the most efficient, even) if the cell is matched to the load. If not, the overall system efficiency will stink.
I.e., you can’t take a 100W 120V/220V bulb, crowbar it across a Li cell, and get it even to glow. The bare wire might be the most lossless “driver” imaginable, dumping the cell’s current straight into the load, but without even a dim red glow, it stinks as a lighting system.
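Quick arithmetic on that bulb (rough, assumed figures; tungsten’s cold resistance is commonly quoted as somewhere around a tenth of its hot resistance):

```python
# Why a mains bulb won't glow on a Li cell: the filament's COLD
# resistance sets the current, and the cell only has ~3.7 V to push.
# All figures are rough assumptions for illustration.

P_RATED, V_RATED = 100.0, 120.0
r_hot = V_RATED**2 / P_RATED    # 144 ohms at operating temperature
r_cold = r_hot / 12             # assume cold R ~1/12 of hot: 12 ohms
v_cell = 3.7                    # nominal Li-ion
amps = v_cell / r_cold          # ~0.3 A
watts = v_cell * amps           # ~1 W into a filament built for 100 W
print(f"roughly {amps:.2f} A, {watts:.1f} W")
```

About a watt into a 100W filament: the wire warms up a little, never gets near incandescence, and the “efficient driver” lights up nothing.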
Whereas an LFP cell matched to an LED that puts out decent light right at 3.2V would be an excellent DD solution. In fact, that’s what newer “solar lights” use, instead of the older NiCd/NiMH cells and boost drivers.
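The same crude linear model shows why the LFP match works where the Li-ion doesn’t (again, the 3.0V knee and the series resistance are just assumed round numbers):

```python
# Matched vs mismatched direct drive, same piecewise-linear model.
# Assumed: white LED with a ~3.0 V knee, ~0.25 ohm total series
# resistance (cell ESR + wiring + LED slope). Illustrative only.

def dd_current(v_cell, vf_knee, r_series):
    return max(v_cell - vf_knee, 0.0) / r_series

R = 0.25
print(f"LFP at 3.2 V:    ~{dd_current(3.2, 3.0, R):.1f} A")  # sane drive level
print(f"Li-ion at 4.2 V: ~{dd_current(4.2, 3.0, R):.1f} A")  # grossly overdriven
```

A volt of extra headroom turns a reasonable operating point into several times the current; the LFP cell’s flat 3.2V discharge curve is what makes it a passable “driver” all by itself.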