CRI and CCT

There are many charts on this site showing the color rendition of various lights, with lots of numbers and graphs that I just don’t understand. I think I understand CCT, and it is pretty basic and easy to spot: 6000K is white, almost trending blue; 4000K is warmer, trending yellow; 2700K is warmer and more yellow still, etc. Pretty sure I get, and can spot, warmer vs. cooler CCT.

I have been playing with an SP10 Pro with the LH351d 5000K CRI 90 and the EO2 II SST-20 4000K CRI > 95. My understanding is that these are both high CRI lights and should both represent colors accurately. I don’t necessarily have a great eye for color, but I am definitely not color blind. It is easy to see that the 4000K SST-20 is warmer than the 5000K LH351d. What I don’t get is: if they are both supposed to be highly color accurate, why is it that I can see a difference in the colors reflected? How can they both be accurate if I can notice a difference between the two?

You need to understand that CRI is an objective comparison between a light source and a theoretical, idealized illuminant of exactly the same color temperature. In other words, something that’s 4000K should only be compared against a reference illuminant that’s 4000K, and something that’s 5000K only against one that’s 5000K. This is why two lights can both have the same CRI yet look drastically different.

They can have differing tints, too. In order of how we perceive light, we distinguish color temperature first, tint second, and CRI last; CRI is rather subtle compared to the other two. Color temperature, put vaguely, is the balance of red to blue in a light, whereas tint is the balance of green to magenta.
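For the curious, here is a rough Python sketch of how those two axes fall out of a measured chromaticity point. McCamy’s CCT approximation and the CIE 1931 to CIE 1960 conversion are standard formulas; the function names are just mine, and this is nowhere near a full Duv calculation:

```python
# Sketch: estimate CCT from a CIE 1931 (x, y) chromaticity via McCamy's
# approximation, and convert to CIE 1960 (u, v), the plane where tint
# (Duv) is defined.

def mccamy_cct(x, y):
    """Approximate CCT in kelvin from CIE 1931 xy (McCamy, 1992)."""
    n = (x - 0.3320) / (y - 0.1858)
    return -449 * n**3 + 3525 * n**2 - 6823.3 * n + 5520.33

def xy_to_uv(x, y):
    """CIE 1931 xy -> CIE 1960 uv. Duv is the signed distance from the
    black body locus in this plane: positive = greenish, negative = magenta."""
    d = -2 * x + 12 * y + 3
    return 4 * x / d, 6 * y / d

# Example: the D65 white point comes out near 6500K, as expected.
print(round(mccamy_cct(0.3127, 0.3290)))  # ~6504
print(xy_to_uv(0.3127, 0.3290))           # (~0.1978, ~0.3122)
```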

CRI is a measure of color accuracy as judged against an ideal light of the same CCT. That is to say, a 4000K light is measured against an ideal 4000K source. That explains why two lights can both be high CRI and still differ in color rendition. Worth mentioning that there is no universal right way to look at color: colors will, and are expected to, look different at different CCTs.

My eyes/brain easily register the difference between a 4000K and a 5000K LED.

I doubt my eyes/brain could tell the difference between two LEDs of the same CCT if one was rated 90 CRI and the other 95 CRI and all other factors were the same.

> why is it that I can see a difference in the colors reflected? How can they both be accurate if I can notice a difference between the two?

LEDs are not “accurate”. They have a lot of great qualities, but they are in no way identical to sunlight or incandescent light.

It’s great that you are making your own observations and noticing differences.

The LH351d has a lower red score (CRI R9) than the SST-20 4000K.

Look at the red (R9) bars in the CRI charts for each emitter. [charts not reproduced]

The extended CRI is made up of fourteen separate test colors (R1-R14), and the SST-20 has higher values than the LH351d.

Red is one of the hardest colors for an LED to make, and producing it well lowers maximum output, so the LH351d can claim “high” CRI Ra without having as high a CRI R9 as the SST-20. The catch is that Ra only averages samples R1-R8, so a weak R9 never shows up in the headline number.
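To make the Ra vs. R9 point concrete, here is a tiny sketch with invented index values (not measurements of either emitter): the general CRI Ra averages only the first eight samples, so a poor red score never drags it down.

```python
# Illustration with made-up special indices: Ra averages only R1-R8,
# so a weak R9 (saturated red) never lowers the headline "CRI" number.

def ra(special_indices):
    """General CRI Ra: the mean of the first eight special indices."""
    return sum(special_indices[:8]) / 8

# Hypothetical R1..R9 scores for two emitters (illustrative only):
led_weak_red   = [95, 94, 92, 93, 94, 92, 93, 91, 55]
led_strong_red = [95, 94, 92, 93, 94, 92, 93, 91, 90]

print(ra(led_weak_red),   led_weak_red[8])    # 93.0 55
print(ra(led_strong_red), led_strong_red[8])  # 93.0 90 -> same Ra, better red
```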

More info in this thread.

The general problem is: how do you compare color? Solution: you compare it to heated, glowing stuff!

CCT places a color on the black body line.
Think of, e.g., a chunk of steel being heated: it glows red, then orange, yellow, white, and so on. Its temperature in kelvin is the CCT, e.g. 3000K.

Now take the 3000K-hot steel and hold it in front of a color test chart.
Then take the LED rated at 3000K and hold it in front of the same chart.
The difference in color rendition between the two is the CRI rating. (In reality you measure the spectrum/frequencies of the light rather than judging by eye.)
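The scoring step actually has a simple formula in CIE 13.3: each test sample’s color shift dE between the reference radiator and the test lamp (computed in the old CIE 1964 U*V*W* space) becomes a special index Ri = 100 - 4.6 * dE. A minimal sketch with made-up shift values, skipping the chromatic-adaptation step the real method includes:

```python
# Sketch of the CIE 13.3 scoring step: each sample's colour shift dE
# between the reference radiator and the test lamp becomes a special
# index, and Ra is the average of the first eight of them.

def special_index(delta_e):
    """R_i = 100 - 4.6 * dE_i; a shift of zero scores a perfect 100."""
    return 100 - 4.6 * delta_e

# Hypothetical shifts for samples R1..R8 (not real measurements):
shifts = [1.0, 1.5, 0.8, 2.0, 1.2, 1.1, 0.9, 1.6]
indices = [special_index(de) for de in shifts]
print(round(sum(indices) / len(indices), 1))  # Ra = 94.2 for this fake lamp
```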

You can do the same at 6000K, which is bluer light instead of orange, and the test chart will look different. But the 6000K LED can still have the same CRI.

CRI is always for a certain CCT.

A black body is the easiest way to reproduce a reference color: simply heat it up to temperature. That already worked in the 1800s.
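That “just heat it up” recipe is the whole appeal: Planck’s law gives the complete spectrum from the temperature alone, no other parameters needed. A quick sketch (physical constants are the standard CODATA values; the wavelengths are arbitrary picks for illustration):

```python
# Planck's law: a black body's temperature alone fixes its whole
# spectrum, which is what makes it such a convenient colour reference.
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck(wavelength_nm, temp_k):
    """Black body spectral radiance (arbitrary overall scale)."""
    lam = wavelength_nm * 1e-9
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * K * temp_k))

# A 3000K radiator emits far more red (650 nm) than blue (450 nm):
print(planck(650, 3000) / planck(450, 3000))  # ~4.2
```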

I have a flashlight with the highest CRI currently available in an LED, the Nichia Optisolis SM653-P9-Rfa00 6500K. It seems strange that a bluish 6500K LED can have such high CRI, but I’m always amazed at how well it renders colors every time I use it!

I prefer lower CCTs, 4000K and below, but as far as very high CRI is concerned, it’s hard to argue with the following chart: [chart not reproduced]

First you see tint:

that’s the blue, green, red, beige, orange, etc.

That’s your first impression on a white wall, ceiling, and so on.

You can translate those tint colors to temperature pretty easily: orange being warm and blue being cool, green being above the black body line and red/magenta being below it. All temps, tints, or hues.

High CRI shows itself when you take any tint or temp and point it at something that has 10 or 15 different popping colors. I have a collection of books that each have a different color, and also a collectors’ guide whose cover is hugely colorful.
Although tints are different at different temps, each high CRI emitter makes the colors pop, whereas any low CRI emitter makes them seem dull, lifeless, and odd.

This is why people hate the Samsung high CRI emitters that are notoriously green-tinted, or some of the SST-20 4000K, or a certain bin of the Nichia 219C 4000K.

Like it or not, there really is no perfect tint, hue, or bin. It’s all subjective. It’s like having an ugly girlfriend: how ugly is too ugly?
Every new light I buy gets judged on whether its tint demands an emitter swap or is within a tolerable level.

I sell, and am happy to use, a small amount of tint correction on, say, an SST-20 with high CRI and a slight green tint. Knocking out just enough green puts the beam into tolerable territory, and now I have a light that doesn’t make me actively want to avoid it. Win-win.