There is a big spike at 450 nm. I know WHY this is - because they start with a blue LED. But is that why the R12 value is low? How can such a large deviation cost only a couple of CRI points?
Related question: How much does it matter that virtually no white LEDs have proper extension into the 700 nm+ range?
Not much, I’d say. What the data shows is that blues are more saturated and tend toward purple (see the color vector graphic); in reality I think this is difficult to see.
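Part of the answer to why a large blue deviation costs so few points: the headline CRI (Ra) is the mean of special indices R1 through R8 only, and R12 (the saturated-blue sample) isn't in that average at all. A quick sketch with hypothetical index values (not measured data) shows how little one bad sample moves even an extended average:

```python
# Hypothetical special-index scores for a high-CRI LED (illustrative only).
R = {i: 95 for i in range(1, 16)}   # R1..R15
R[12] = 60                          # saturated blue scores poorly

# Standard Ra averages only R1..R8, so R12 doesn't enter at all.
Ra = sum(R[i] for i in range(1, 9)) / 8           # stays at 95.0

# Even an extended average over all 15 samples dilutes one low value.
Re = sum(R[i] for i in range(1, 16)) / 15         # drops only ~2.3 points

print(Ra, round(Re, 2))
```

So a 35-point deficit in one saturated sample either vanishes entirely (Ra) or is diluted to a couple of points (extended averages).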
To help verify the CRI of your particular flash, all ya have to do is get the Ozark Trail OT 50L at yer local Walmart for $1.00 as stated in my signature.
This will tell you if the TCR matches the CRI that you thought should be there.
Nothing can better define your particular unique flash’s CRI expectations than reality.
TCR is a subjective rating of a particular flash. Every flash is different, because every emitter is different. Granted, web consensus can then become a strong, or even better, indicator of production consistency. This is where TCR fills a gap that CRI can’t.
If you think your eyes are inferior to CRI then ya shouldn’t care. Just buy watt the CRI sez and if you’re unhappy with watt ya got then ya can blame CRI, right? :student:
Looking at a few 730 nm LED datasheets, part of their spectrum is in the visible range, which is what we would see, just as a UV LED also emits in the visible range.
Depends on the lighting and tint bin. The SW45K seems fairly tightly binned, and most people would say it is pink, but it looks blue compared to warmer lights or at night. I have had three bins of the SST-20 4000K and they range from yellow-green to white-blue-pink in tint.
Interesting. Our Bundesamt für Strahlenschutz (BfS, the German Federal Office for Radiation Protection) gives the visible range as extending up to 780 nm. Is this a mean/max thing? Or do I just not understand the photopic luminosity function (likely)?
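Both statements can be true: 780 nm is the conventional upper edge of the visible band, but the eye's photopic sensitivity V(λ) is already tiny well before that, which is why missing 700 nm+ output matters so little for perceived brightness. A minimal sketch using a well-known Gaussian approximation to V(λ) (an approximation, not the tabulated CIE data):

```python
import math

def v_photopic(lam_nm: float) -> float:
    """Gaussian approximation to the CIE photopic luminosity
    function V(lambda); wavelength given in nm."""
    lam_um = lam_nm / 1000.0
    return 1.019 * math.exp(-285.4 * (lam_um - 0.559) ** 2)

for lam in (555, 650, 700, 780):
    # Sensitivity collapses by orders of magnitude past the red peak:
    # ~1.0 at 555 nm, ~0.1 at 650 nm, ~0.0035 at 700 nm, ~1e-6 at 780 nm.
    print(lam, v_photopic(lam))
```

So light at 780 nm is still "visible" by definition, but it contributes roughly a millionth as much perceived brightness per watt as 555 nm green.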