Well, I certainly know that 100% FET modes at 6A (the 18650 + XM-L example) are far less efficient in lumens per watt than linear regulation at 2A, so OK.

Is it really clear, though, that an FET with PWM producing a 2A average from 6A pulses is definitely less efficient than a 2A linear regulator would be? If there's data you can point to off the top of your head, I'd appreciate it (that's not meant to be argumentative, I'd just be interested).

One thing that seems very clear to me is that improving thermal performance has a huge effect on reducing the efficiency fall-off at high current. A 2A average from 4.5 kHz pulses produces the same LED temperature, *I guess*, as 2A constant, no? So I was hoping 6A at 33% PWM would be quite a bit more efficient than 6A flat out. Flat out, it seems like roughly a 25% efficacy hit compared to 3A (though that's really the extra 3A only delivering about 50%).
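For what it's worth, here's the arithmetic I'm picturing, as a toy Python sketch. The lumen and forward-voltage curves use made-up coefficients (only very roughly XM-L-shaped), so treat the numbers as illustration, not measurement:

```python
# Toy model only -- invented coefficients, not measured XM-L data.

def lumens(i_amps):
    # Output droops at high current: linear term minus a quadratic term.
    return 400.0 * i_amps - 30.0 * i_amps**2

def vf(i_amps):
    # Forward voltage rises roughly linearly with current.
    return 2.9 + 0.1 * i_amps

def led_efficacy(i_amps):
    # Lumens per watt at the LED itself.
    return lumens(i_amps) / (vf(i_amps) * i_amps)

duty = 2.0 / 6.0  # 33% PWM of 6A pulses -> 2A average

# PWM scales average lumens and average watts by the same duty factor,
# so the average efficacy is just the 6A efficacy:
pwm_lm = duty * lumens(6.0)       # average lumens
pwm_w = duty * vf(6.0) * 6.0      # average watts
print(pwm_lm / pwm_w)             # same as led_efficacy(6.0), ~63 lm/W here

# Constant 2A keeps the better low-current operating point:
print(led_efficacy(2.0))          # ~110 lm/W here
```

The takeaway of the sketch (if the droop curve is even roughly right) is that PWM dimming inherits the full-current efficacy no matter the duty cycle, while true constant-current regulation sits on the better part of the curve.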

I'll believe the experts. I'm just wondering whether this is intuition or a clearly established fact that linear regulation wins. Of course it depends on the mode. At very low power, pure FET PWM stays equally inefficient at worst (no? still 6A pulses), while linear gets worse (it must shed more volts, if we're talking pure linear at low power with no PWM). At 3A in this example, it does seem like a potentially close call at least.
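To make the low-power case concrete, here's the same kind of toy comparison at the system level (lumens per battery watt). All the coefficients are invented, and it ignores resistive losses and thermal effects. The thing it does capture is that linear and FET-PWM both draw the full battery voltage, so the linear driver's "shed volts" and the LED's worse 6A operating point are being paid out of the same budget:

```python
# Toy model again -- invented numbers, ignores thermal and resistive effects.

V_BATT = 4.0     # assumed battery voltage under load
I_PULSE = 6.0    # assumed direct-drive FET pulse current

def lumens(i_amps):
    # Same made-up droop curve: lumens per amp fall as current rises.
    return 400.0 * i_amps - 30.0 * i_amps**2

def linear_system_efficacy(i_avg):
    # Linear regulator: battery supplies V_BATT at i_avg; the pass
    # element burns off whatever voltage the LED doesn't take as Vf.
    return lumens(i_avg) / (V_BATT * i_avg)

def pwm_system_efficacy():
    # FET PWM: average lumens and average battery power both scale with
    # duty cycle, so the average equals the full-pulse-current figure.
    return lumens(I_PULSE) / (V_BATT * I_PULSE)

for i_avg in (0.5, 1.0, 2.0, 3.0):
    print(i_avg, linear_system_efficacy(i_avg), pwm_system_efficacy())
```

Under these invented curves, linear comes out ahead at every average current below 6A, and the gap widens at low power; whether real hardware matches that depends on the actual droop curve and the resistive losses I've ignored, which is exactly the data I'd like to see.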