With regard to your earlier statement about 0.25 lux: this is the industry-standard illumination level used to calculate 'throw', and it is approximately equivalent to the brightness of full moonlight. In practice 0.25 lux is essentially worthless at distances greater than a meter or so, because the light needs to reflect off a target and back to your eyes.
I think the following is correct, but I struggle with maths!
According to your real-world data, I've calculated the lux falling on the target at the maximum distance you can see it, using the inverse-square law (lux = candela / distance²):
Klarus XT11GT = 24964 cd at 1m = 2.5 lux on the target at 100m
Fenix HP30R = 10268 cd at 1m = 1.27 lux on the target at 90m
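For anyone who wants to check my working, here's a quick sanity check in Python using the inverse-square law; the candela values are the ones quoted above (the function name is just mine for illustration):

```python
# Inverse-square law: illuminance (lux) = intensity (cd) / distance (m) squared.
# Candela is defined at 1 m, so the quoted cd figure is the numerator directly.
def lux_at(candela, distance_m):
    return candela / distance_m ** 2

print(round(lux_at(24964, 100), 2))  # Klarus XT11GT at 100 m -> 2.5 lux
print(round(lux_at(10268, 90), 2))   # Fenix HP30R at 90 m -> 1.27 lux
```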
Unless I've mucked up the calcs, these two values vary more than I'd expect if the visibility threshold were the only factor.
I speculate the following:
1) Either quoted manufacturer values are wildly incorrect (unlikely from these brands)
2) Subtle change in weather conditions between testing the lights impacts visibility
3) CCT or CRI significantly impacts the distance the marks are visible (unlikely, especially as I believe they are both ‘cool’ white tint)
Using ‘worst case’ value, to get 2.5 lux at 200m requires a light with 100kcd minimum.
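Rearranging the same inverse-square formula gives the minimum candela needed for a given illuminance at a given distance; a small sketch (function name is mine):

```python
# Required intensity: candela = target lux * distance (m) squared
def candela_needed(lux, distance_m):
    return lux * distance_m ** 2

print(candela_needed(2.5, 200))  # -> 100000.0 cd, i.e. 100 kcd
```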
The Nightseeker NS22 with the 4000K ST-20 would hypothetically illuminate the target at 189.7m. (4000K is more 'neutral' than 3000K, which in my opinion is overly warm.)
It's not been mentioned yet, and I'm sure you're aware, but for other readers of this thread: I would caution against mixing cells between a 2*18650 headlamp and a 1*18650 lamp. The two cells in the headlamp should be matched in voltage and capacity.
Edit to add: I forgot to mention that for ANSI ‘throw’ specifications that manufacturers list, 100kcd is equivalent to 632m of throw.
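Since the ANSI throw figure is just the distance at which illuminance falls to 0.25 lux, it follows from the same law that throw = sqrt(candela / 0.25). A quick check of the 100kcd figure:

```python
import math

# ANSI FL1 throw: distance (m) at which illuminance drops to 0.25 lux
def ansi_throw_m(candela):
    return math.sqrt(candela / 0.25)

print(round(ansi_throw_m(100_000)))  # 100 kcd -> 632 m
```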