Sure, no heat is dissipated in the driver, but with e.g. a cell at 4.2V and an LED that only needs say 3.5V for the desired brightness, you are hammering the LED into an operating region way beyond peak efficiency.
PWMing it to lower brightness doesn’t change these facts.
There may be no wasted power dissipated in the driver; instead it's dissipated in the LED, in the internal resistance of the cell, and in other bits of the torch (springs etc.).
Direct drive is the crudest and least efficient way of powering an LED, until the cell has discharged down to the LED's Vf, by which point it is pretty well empty.
It is not obvious just how inefficient direct drive is. Looking at my example you might think the system is (3.5/4.2) = 83% efficient, but far from it: you also need to factor in the much higher current the LED draws at the higher voltage, which you can estimate using e.g. djozz's measurements of LED transfer characteristics.
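To put rough numbers on it, here's a back-of-envelope sketch in Python. Everything in it is an illustrative assumption (the knee voltage, the resistances, the 2A "wanted" current), not djozz's actual data; the LED is crudely modelled above its knee as Vf = V0 + I*R_led:

```python
# Back-of-envelope direct-drive operating point. All numbers are made up for
# illustration; the LED is modelled above its knee as Vf = V0 + I * R_LED.
V_CELL = 4.2      # resting cell voltage, V
R_INT  = 0.03     # cell internal resistance, ohm (assumed)
R_PATH = 0.02     # springs, switch, wiring, ohm (assumed)
V0     = 2.8      # LED knee voltage, V (assumed)
R_LED  = 0.20     # LED dynamic resistance, ohm (assumed)
I_WANT = 2.0      # current that would give the brightness you actually want, A

# Direct drive settles where the cell voltage minus the resistive drops
# equals the LED forward voltage: V_CELL = I*(R_INT + R_PATH + R_LED) + V0
i_dd = (V_CELL - V0) / (R_INT + R_PATH + R_LED)
p_dd = V_CELL * i_dd                       # power pulled from the cell
p_want = (V0 + I_WANT * R_LED) * I_WANT    # LED power at the intended brightness

print(f"Direct drive: {i_dd:.1f} A, {p_dd:.1f} W from the cell")
print(f"Wanted level: {I_WANT:.1f} A, about {p_want:.1f} W at the LED")
# Roughly 5.6 A and 24 W instead of the ~6 W actually needed, and every one of
# those extra watts is delivered at a current where the LED's lm/W is at its worst.
```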
The most efficient system is to power the LED with a steady (non-PWMed) current, at the level needed to achieve the desired brightness. LED efficiency drops dramatically when over-driven in direct drive; in the limit you can easily reach the point where e.g. 50% more power gives only 10% more light.
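As a quick worked illustration of that limit (the lumen figures here are invented, purely to show the arithmetic):

```python
# Diminishing returns at the top end: 50% more power for 10% more light.
p1, lm1 = 10.0, 1000.0    # baseline operating point (assumed numbers)
p2, lm2 = 15.0, 1100.0    # overdriven: +50% power, +10% light
print(f"{lm1 / p1:.0f} lm/W  ->  {lm2 / p2:.1f} lm/W")
# Efficacy falls from 100 lm/W to about 73 lm/W: more than a quarter of the
# efficiency is given up for a brightness difference the eye barely notices.
```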
Once you start modding with e.g. spring bypasses and better FETs in pursuit of headline lumen figures for brief blasts of turbo, you are also further reducing efficiency in normal operation.
Don’t kid yourself: a FET driver is always pushing and stressing the LED at full turbo level (its least efficient point), and hammering the cell with the full instantaneous current, leading to increased loss in its internal resistance, even when the output is PWMed back to much lower light levels.
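The internal-resistance penalty is easy to put numbers on, because the I²R loss follows the instantaneous current, not the average. With assumed figures of 8A at full FET turbo, a 25% duty cycle and 30mΩ of cell resistance:

```python
# Why PWM dimming on a FET driver still hammers the cell: I^2 * R losses
# follow the instantaneous current, not the average. Illustrative numbers.
R_INT   = 0.03    # cell internal resistance, ohm (assumed)
I_TURBO = 8.0     # full direct-drive current, A (assumed)
DUTY    = 0.25    # PWM duty cycle for a "25%" brightness level

i_avg    = DUTY * I_TURBO
loss_pwm = DUTY * I_TURBO**2 * R_INT    # average loss while pulsing at turbo current
loss_cc  = i_avg**2 * R_INT             # loss if the same average current were drawn steadily

print(f"PWM at {DUTY:.0%} duty: {loss_pwm:.2f} W lost in the cell")
print(f"Steady {i_avg:.0f} A:      {loss_cc:.2f} W lost in the cell")
# The PWMed FET wastes 1/duty times as much in the cell (4x here) for the same
# average current - and the LED is also less efficient at 8 A than at 2 A, so
# this comparison actually flatters the FET driver.
```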
A linear driver can deliver that steady current by burning off the excess voltage as heat, whilst still operating the LED at an efficient point (generally, the lower the current, the more efficiently the LED performs). A buck (or boost) driver does the same more elegantly, converting the voltage down (or up) rather than wasting the excess as heat.
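A rough comparison of those two approaches, again with assumed numbers (a 4.0V cell under load, 3.0V Vf at a regulated 1A, ~92% converter efficiency):

```python
# Linear vs buck driver: both regulate the LED current; they differ only in
# what happens to the excess cell voltage. Illustrative numbers only.
V_CELL = 4.0    # cell voltage under load, V
V_LED  = 3.0    # LED Vf at the regulated current, V (assumed)
I_LED  = 1.0    # regulated LED current, A

p_led    = V_LED * I_LED
p_linear = V_CELL * I_LED     # linear: cell current equals LED current, excess V burnt off
p_buck   = p_led / 0.92       # buck: ~92% conversion efficiency (assumed)

print(f"Linear: {p_led / p_linear:.0%} of cell power reaches the LED "
      f"({p_linear - p_led:.2f} W lost as heat in the driver)")
print(f"Buck:   {p_led / p_buck:.0%} of cell power reaches the LED")
# Either way the LED itself runs at its efficient 1 A point; the buck simply
# wastes less of the cell's energy getting it there.
```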