I get what you mean. I’ve been walking through meadows and along the edge of the woods at night lately. All the different plants and grasses at different heights with scrolling shadows look richer with a warmer tint. I don’t know how much CRI plays a role. I’d have to check back to back but I seem to get the same effect with XHP50.2 and XHP70.2 at 3000K as I do with SST-20 at 2700K.
If something appears or looks different at different times and in different circumstances, how do you know which one is the real one? We know an apple is supposed to be red, but if one LED shows it as bright red and one shows it as a duller red, which one is closer to reality?
Even two people using the same light source will see the apple differently.
My inbred kids won’t be able to complain about always having chicken for dinner anymore, now they can choose between low CRI chicken and high CRI chicken.
That is 90% the tint and/or CCT of the two lights making the difference, and the other 10% the high-CRI LED producing deeper reds.
CRI is such a limited way to objectively measure light quality. R9 is just a sliver of deep red taken from a broad spectrum of light— or rather, one single color sample. Just because a light has a negative R9 on its CRI graph doesn’t mean it can’t produce any red content whatsoever.
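For context, that’s baked into how the scores are combined (loosely following the CIE 13.3 definition): each special index R_i is 100 minus a scaled color shift for one test sample, and the headline Ra averages only samples 1–8, so R9 never enters that average at all. A minimal sketch, with the delta-E shift values made up purely for illustration:

```python
# Sketch of how CRI indices combine (per the CIE 13.3 definition).
# Assumes the delta-E color shifts (test lamp vs. a reference of the
# same CCT) for each test color sample are already measured; the
# numbers below are hypothetical.

def special_index(delta_e):
    # R_i = 100 - 4.6 * delta_E; a large shift drives it negative
    return 100 - 4.6 * delta_e

def general_cri(delta_es):
    # Ra averages only samples 1-8; R9 (saturated red) is excluded,
    # so a light can post a decent Ra while R9 goes negative
    return sum(special_index(d) for d in delta_es[:8]) / 8

# made-up shifts: samples 1-8 modest, sample 9 (deep red) large
delta_es = [1.0, 1.5, 2.0, 1.2, 1.8, 2.2, 1.1, 1.6, 25.0]
print(round(general_cri(delta_es), 1))       # decent Ra...
print(round(special_index(delta_es[8]), 1))  # ...despite negative R9
```

With these invented numbers you get an Ra around 93 alongside an R9 of −15, which is exactly the “good CRI, negative R9” situation described above.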
I bet if you took a high-CRI light with an amazing R9 but a green tint and high CCT, and compared it to a lower-CRI light with a pink tint and warm CCT but only so-so R9, you’d probably walk away preferring the lower-CRI light.
And on the topic of CRI as a whole— it’s a subtle quality of light, minuscule compared to CCT and tint. The only real way to accurately judge how CRI affects light is to pit two light sources of the same CCT and tint against each other, with only the CRI varying.
Hey, I’m happy to report the high-CRI does seem to make for better photography! The difference is more apparent through my phone camera. Combined with the TIR optic I’m able to get some good shots for my QC dealings.
Thank you, I completely agree. That is the ONLY way to show the benefit or lack of benefit of CRI: eliminate all the variables except CRI & let the chips fall where they fall.
CCT perception is mostly a matter of raw lux: at low lux, low CCT looks better.
The sheer amount of lux needed for lights over 4500K to be perceptually good renders anything above that pointless in a flashlight. 6500K is for intensely bright (500–1000 lux) corporate lighting in offices, etc.
A lot of CRI differences come right back to tint preference and whether your eyesight can even tell the difference.
Much like the White And Gold Or Black And Blue dress thing, we all see things a little differently, and this is a good thing;
it is what makes us perfectly made imperfect Humans.
Peace out.
I could never understand the dress thing. Always wondered wtf’s wrong with those people who saw… whatever was the wrong color combo.
Then again, I wondered if other people saw things the way I would see things, color-wise. E.g., what if I saw a woman, and she looked normal to me, but show her to 4 other people, and they see “in their brains” something like
The highest-CRI LED (I believe) is the 6500K Nichia Optisolis. I have them in a triple S2+ and it is very unique. It has a bluish tint, like most of the cheapo flashlights you can buy, but renders colors amazingly well. And I find it interesting that the lower-CCT Optisolis LEDs actually have lower CRI.
This is the most important point of the discussion and I wish it was highlighted more often. CRI is a comparison to a blackbody OF THE SAME CCT. As soon as you change the CCT the comparison is largely meaningless. To say I like a high CRI 4000K over a low CRI 6500K says far more about what CCT I prefer than the CRI.
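To make that concrete: the CRI procedure itself switches its yardstick with the test lamp’s CCT, so scores taken at different CCTs are comparisons against different references. A trivial sketch of the selection rule (per CIE 13.3; the function name is just for illustration):

```python
def cri_reference(cct_k):
    # CIE 13.3: the reference is a Planckian (blackbody) radiator
    # below 5000 K and a CIE daylight-phase illuminant at or above
    # 5000 K -- in both cases matched to the test lamp's own CCT
    return "blackbody" if cct_k < 5000 else "daylight phase"

# A 4000 K lamp is scored against a 4000 K blackbody; a 6500 K lamp
# against D65-like daylight: two different comparisons entirely.
print(cri_reference(4000))
print(cri_reference(6500))
```

So “high-CRI 4000K vs. low-CRI 6500K” really is apples to oranges: each lamp is only graded on fidelity to a reference at its own CCT, never against the other’s.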