Branded flashlights way off their specifications

I want to discuss branded flashlights whose output is way off the manufacturer's specifications.
Two examples: the Acebeam L18 and the Nitecore P30 New. I know there are plenty of lights like this, but these "ultra branded", more expensive flashlights should not allow themselves such things at their prices.

The Acebeam L18 is rated at 1500 lm on turbo, and I measured only 1050 lm; high is rated at 750 lm, and I measured 657 lm.
The Nitecore P30 New is rated at 1000 lm on turbo, and I measured only 807 lm; high is rated at 400 lm, and I measured 390 lm.
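
To put the shortfall in perspective, here is the quick arithmetic on those measurements (a plain Python sketch; no numbers beyond the ones above):

```python
# How far the measured outputs fall short of the ratings above.
lights = [
    ("Acebeam L18 turbo",      1500, 1050),
    ("Acebeam L18 high",        750,  657),
    ("Nitecore P30 New turbo", 1000,  807),
    ("Nitecore P30 New high",   400,  390),
]
for name, rated, measured in lights:
    print(f"{name}: {measured / rated:.0%} of rated ({rated - measured} lm short)")
```

So turbo on the L18 is only 70% of spec, while the P30 New's high mode is within 3%.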

For comparison, a Convoy M21A with the KW CSLPM1.TG measured 1087 lm.

Comparing the Convoy and the Acebeam, which use the same LED, performance is very similar (the same to my eye at a 200 m distance), except that the Acebeam gets abnormally hot after just a short time, about one minute; that is extremely hot for such a short runtime.
The Convoy stays cool over the same period.

This Acebeam is way out of spec and it gets hot as hell. I know it is not the only such light and there are many more, but from a company like Acebeam this is unacceptable.
Nitecore also disappoints; much less "branded" flashlights have more accurate specifications…

My question is this: these branded manufacturers must have an integrating sphere and must measure the lights they produce. Of course I do not expect that if a light is rated at 1500 lm, every unit will put out exactly 1500.00 lm; some tolerance is normal. But when the output is this much lower than what is written, I can't understand what they are doing and why they don't publish realistic numbers…

I think you will find a lot of it is marketing. The Astrolux FT02S is rated at 11,000 lumens, while the Mateminco equivalent is rated at 13,500, and both come from the same manufacturer. I don't think the numbers are to be taken 100 percent seriously.

Yeah, and Astrolux numbers are historically inflated to begin with. But he was talking about brands that are supposedly more reputable: Acebeam and Nitecore.

Exactly that. Manufacturers that are top notch in this field, and very expensive.
I would not complain if Ultrafire or Trustfire or some other budget brand published numbers out of spec.
But these are companies that have all the equipment for measuring and testing; I can't see the point of doing this, since it is not something nobody can check.
Sofirn, for example, has much more accurate specifications than these brands, and its lights are three times cheaper or more…

I’m glad you mentioned this. I’ve been reading/watching tons of reviews lately, and it doesn’t seem like any of them rate their lights fairly. Extremely annoying.
There was one, can’t remember the model, that was rated at a continuous 25,000 lumens and actually did 31k, which was super impressive. (BLFGT4)

Cheers

What sphere did you use to measure the lights?

My own sphere, calibrated with reference lights from Maukka.

All is OK, since I measure other lights and those readings are good. The lights I mention are just way out of spec, and it is a shame that manufacturers print higher numbers on high-cost flashlights.

Since the Osram WFs are fairly inefficient in terms of lm/W, cell health and clean connections actually are a consideration. I imagine that if the OP has a sphere and calibration lights from Maukka, though, they would be the type to think about those factors.

In regards to the lower-than-rated output, unfortunately there is still a lot of variation in LEDs, even within the same bin, and manufacturing/assembly mistakes could cause issues too, like a substandard reflow of the emitter, a poorly performing chip on the driver, etc. At the prices these brands sell their lights for, expecting every unit to be tested against its full rated output is unrealistic, and I don't see a solution (other than underrating output, which no brand will likely do willingly due to competition). Numbers on the box sell lights for big brands, unfortunately.

What would interest me is how the runtime of lights hitting only about 80% of rated output compares to what's listed. If it's longer, that's at least a trade-off rather than simply underperforming.
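
As a rough back-of-envelope, assuming a similar lm/W efficiency and the same usable battery energy, runtime should scale roughly inversely with output (the 120 min rated runtime here is hypothetical, just for illustration):

```python
# If a light makes only 80% of rated lumens at a similar lm/W efficiency,
# it draws roughly 80% of the power, so the same battery energy should
# last about 1 / 0.8 = 1.25x the rated runtime.
rated_runtime_min = 120   # hypothetical rated runtime, for illustration only
output_fraction = 0.80    # measured / rated lumens

estimated_runtime_min = rated_runtime_min / output_fraction
print(f"~{estimated_runtime_min:.0f} min vs {rated_runtime_min} min rated")
```

Whether real lights actually bank that extra runtime is exactly what a review measurement would show.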

Your numbers on the L18 are not far off mine. I didn’t get anywhere near 1500 lumens; even at turn-on it was under 1300 lumens. Throw was in line with Acebeam’s numbers, though.

Manufacturers test lights on their own terms with their own equipment, and variation is to be expected. Even Olight, Fenix, and Imalent are off by 10% sometimes, but their numbers are pretty close.

Bottom line, don’t expect your as-tested numbers to match perfectly.

Sony VTC5A and Samsung 40T.

I have two Acebeam L18s and three New P30s, and they all perform the same.

I don’t expect that a light rated at 1500 lumens must measure exactly 1500 lumens, but 1050 is not acceptable.
Likewise, roughly 200 lm less from Nitecore is not OK for that brand.

The L18 is toooo hot after just one minute of use; they totally missed the mark there.

I agree, to the point that it would be great to have something like a "gold standard" for measuring luminous flux and peak intensity, like measuring runtimes in accordance with ANSI/NEMA FL1. Since every manufacturer uses their own equipment, there is no telling which numbers are accurate and which are highly inflated. Maybe Maukka could send a calibrated S2+ light to Acebeam, Olight, etc. to help them agree upon some standard. But this is wishful thinking, and the so-called premium brands will not listen to us flashoholics anyway (my personal experience, of course).

Concerning the L18 getting too hot: well, their L17 behaves the same way. To my understanding, Acebeam uses less strict thermal regulation that allows a higher offset from the set threshold (55°C) when starting turbo mode. As a result, the light gets really hot at first (67-70°C) but then settles at 55°C if left running as is.
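
For anyone curious, here is a minimal sketch of how that kind of overshoot-tolerant throttling might behave. This is not Acebeam's actual firmware; the 55°C threshold and the startup overshoot come from the temperatures described above, and everything else is assumed for illustration:

```python
# Rough sketch of an overshoot-tolerant thermal throttle (assumed, not
# Acebeam's real firmware). Threshold and overshoot match the temperatures
# described above; step sizes and timing are made up for illustration.

THRESHOLD_C = 55.0   # steady-state temperature target
OVERSHOOT_C = 15.0   # extra headroom allowed right after turbo starts

def regulate(temp_c: float, seconds_in_turbo: float, output: float) -> float:
    """Return the next output level (0.0-1.0) given the head temperature."""
    # Tolerate a higher temperature early on, so turbo holds longer at first.
    limit = THRESHOLD_C + OVERSHOOT_C if seconds_in_turbo < 60 else THRESHOLD_C
    if temp_c > limit:
        return max(0.3, output * 0.95)   # step down toward a sustainable floor
    if temp_c < THRESHOLD_C - 5.0 and output < 1.0:
        return min(1.0, output * 1.02)   # ramp back up slowly once it cools
    return output
```

That would explain both the scary surface temperature in the first minute and the eventual settling at the 55°C target.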