Let's Talk Tints

“Color rendering index, or CRI, is a measure of the quality of color light, devised by the International Commission on Illumination (CIE). It generally ranges from zero for a source like a low-pressure sodium vapor lamp, which is monochromatic, to one hundred, for a source like an incandescent light bulb, which emits essentially blackbody radiation. It is related to color temperature, in that the CRI measures for a pair of light sources can only be compared if they have the same color temperature. A standard ‘cool white’ fluorescent lamp will have a CRI near 62.”

“CRI is a quantitatively measurable index, not a subjective one. A reference source, such as blackbody radiation, is defined as having a CRI of 100 (this is why incandescent lamps have that rating, as they are, in effect, blackbody radiators), and the test source with the same color temperature is compared against this.”

I found this in a search and cannot verify its accuracy.

Yes. However, CRI can be a poor standard on which to base accurate color rendition.

First, CRI is by definition 100, regardless of the CCT (the color temperature of the light), if the source matches a blackbody radiator (such as an incandescent lamp). But CRI 100 at a CCT of 1500K is vastly different from CRI 100 at a CCT of 10000K. The former is heavily weighted toward red light, while the latter is heavily weighted toward blue. Things are going to look very different under those lighting conditions, and colors will not be accurately identified under either.
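If you want to see how lopsided those two spectra are, here is a minimal Python sketch of Planck's law (my own illustrative code, not from any standard) comparing blue (450 nm) and red (650 nm) radiance at the two temperatures:

```python
import math

# Planck's law: spectral radiance of a blackbody at wavelength lam (meters)
# and temperature T (kelvin). H, C, K are Planck's constant, the speed of
# light, and Boltzmann's constant.
H, C, K = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * K * T)) - 1)

for T in (1500, 10000):
    blue, red = planck(450e-9, T), planck(650e-9, T)
    print(f"{T:>5} K  blue/red radiance ratio: {blue / red:.3f}")
```

At 1500 K the blue/red ratio comes out far below 1 (almost all red), while at 10000 K it is above 2 (blue-heavy), which is the imbalance described above.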

Second, the CRI standard uses a limited number of test colors to determine the CRI of a light source. The full set is 14 samples, but the general Ra value averages only the first eight, all moderately saturated pastels. Usually, that's good enough. But it under-represents deep red (the R9 sample doesn't count toward Ra).
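For the curious, the arithmetic behind the number is simple once the color shifts have been measured: each special index is Ri = 100 − 4.6·ΔEi (the CIE 1964 color difference for test sample i), and the general Ra is the plain average of the first eight. A minimal sketch with made-up ΔE values:

```python
# Each special index: R_i = 100 - 4.6 * dE_i, where dE_i is the CIE 1964
# U*V*W* color difference between how the sample appears under the test
# light versus under the reference illuminant.
def special_index(de):
    return 100.0 - 4.6 * de

# General CRI (Ra) averages only the first eight samples, all moderately
# saturated pastels; the deep-red sample (R9) is not counted.
def general_cri(delta_es):
    first_eight = delta_es[:8]
    return sum(special_index(de) for de in first_eight) / len(first_eight)

# Hypothetical shifts for samples 1..9; note that the large deep-red error
# (the last value) has no effect on Ra at all.
print(round(general_cri([1.2, 0.8, 2.1, 1.5, 1.0, 1.8, 0.9, 1.4, 8.0]), 1))
```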

I really need both the CRI and the CCT (the latter is even more important to me) when figuring out what kind of LED lighting I want. Tint (green or magenta) is also very useful to know. I like tints on the magenta side (like Nichia 219 LEDs), not the green side (like most Cree LEDs).

I think that’s sort of misusing the terminology. Incandescent lighting can be full spectrum, but you’d have to get the filament glowing much hotter than the typical ~2500K. 5500K would give you a spectrum similar to the sun’s, which is full spectrum. But I’m not aware of any filament material that could survive that kind of temperature; even tungsten melts at around 3700K.

But generally, incandescent lighting is very poor in blue wavelengths. The blue is there, just not much of it. That’s why it’s hard to distinguish deep blue from black, or blue from violet, under standard incandescent lighting, unless it’s bright.
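Wien's displacement law puts a number on that blue deficit: a blackbody's emission peaks at λ = b/T with b ≈ 2.898×10⁻³ m·K, so a household filament peaks deep in the infrared. A quick sketch:

```python
# Wien's displacement law: wavelength of peak blackbody emission.
WIEN_B = 2.898e-3  # m*K

def peak_nm(T):
    return WIEN_B / T * 1e9

for T in (2700, 5500):
    print(f"{T} K blackbody peaks near {peak_nm(T):.0f} nm")
# ~2700 K peaks around 1073 nm (infrared), while ~5500 K peaks around
# 527 nm (green), which is why sunlight has plenty of blue and an
# ordinary incandescent bulb does not.
```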

There is much to know beyond a CRI number.

It reminds me of a quote from Terence McKenna: ‘The bigger the bonfire, the more darkness is revealed.’ :slight_smile:

I have never paid much attention to CRI or neutral/warm tints; I’ve always been a big fan of cool white. But after so many XP-L HI V2 1A mods, I admit it starts to get boring. I made a triple Nichia 219C and ordered a 5A BLF D80 some time ago, but it was definitely too warm for me; it looked “dirty”.

Decided to try my luck again and got a 3C EagleEye X2R http://www.gearbest.com/led-flashlights/pp_411562.html. I was blown away by the tint! Everything looked so much more vibrant, and the temperature was simply pleasing to the eyes. I don’t even know what bin this 3C tint is, or whether it is high CRI of any kind. It looks a lot like my Armytek Wizard XHP50 “CW”, which is actually more like 5000K, but somehow the tint of this 3C is prettier, and wherever I shine it, it gives a better sense of depth. Not sure how to explain it…

After playing a bit more with the light, I decided to build my first ever high-CRI light. I’m looking for an emitter that is no warmer than this 3C but has higher CRI. It can be XP or XM, as I have both centering rings. I’m also looking for maximum brightness with a FET driver, so no Nichia… Any recommendations?

Thanks!

Cree XHP35 is available with 90+ CRI even at 5000K and 5700K CCT (your 3C is supposed to be between 4750K and 5000K CCT). Note that it is a 12V emitter.

I remember I had one Osram LED (not a Golden Dragon) that had one main white die and a red diode on the side to compensate for the tint.

In moonlight mode the LED goes really pinkish-warm, but it gets really blue on max (120 lumens).

It makes me wonder whether adding red would change the perception of fake CRI. It certainly didn’t work when I tried an XM-L2 1A tint with an XR-E red emitter, but the reddish tint is somewhat pleasant.

Which makes me wonder if this is why dive lights need lots of red in their beams.

Longer (redder) wavelengths scatter less, so they are better at cutting through fog and humid air. Water is a different story, though: it strongly absorbs red, which is why everything looks blue at depth. Dive lights carry extra red largely to put back the part of the spectrum the water filters out.
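The scattering half of that is easy to put numbers on, since Rayleigh scattering scales as 1/λ⁴. A tiny sketch (illustrative, not a full optics model):

```python
# Rayleigh scattering intensity scales as 1/wavelength^4, so shorter
# (bluer) wavelengths scatter far more strongly than longer (redder) ones.
def relative_scatter(lam_nm, ref_nm=650.0):
    return (ref_nm / lam_nm) ** 4

print(f"450 nm blue scatters {relative_scatter(450):.1f}x more than 650 nm red")
# Prints about 4.4x. In water, though, absorption (which is strong for red)
# dominates over scattering, hence the extra red in dive lights.
```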

+1

I too am rather new to understanding tints, as you can read from my comments in this thread, but like you I have grown fond of a 5000K 3C tint. I now understand better the reasons why. :slight_smile:

Thanks to jon_slider and WalkIntoTheLight for the outstanding explanations they posted.

  • "Neutral" tints are those found along the black-body radiator line.
  • At a given CCT, above the black-body radiator line you will find "greenish" tints. Below are magenta tints.
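As a rough sketch of how that geometry is usually computed (my own illustrative Python; McCamy's well-known CCT approximation plus the exact CIE 1931 → CIE 1960 conversion):

```python
# Estimate CCT from CIE 1931 (x, y) chromaticity using McCamy's
# approximation, then convert to CIE 1960 (u, v), the space in which the
# distance above/below the blackbody locus (Duv) is measured.
def mccamy_cct(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

def uv_1960(x, y):
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 6.0 * y / d

# Example: chromaticity near the D50 daylight point (illustrative values).
x, y = 0.3457, 0.3585
u, v = uv_1960(x, y)
print(f"CCT ~ {mccamy_cct(x, y):.0f} K, (u, v) ~ ({u:.4f}, {v:.4f})")
# At a given CCT, a v above the locus point (positive Duv) reads greenish;
# a v below it (negative Duv) reads magenta.
```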

Sidecross also shared this gem from FullSpectrumSolutions:

  • "[CRI] is related to color temperature, in that the CRI measures for a pair of light sources can only be compared if they have the same color temperature."
  • Thus, a 100 CRI (Ra) source with a CCT of 2750K will not do as good a job rendering the visible color spectrum as a 100 CRI (Ra) source with a CCT of 5200K.

A good for-instance is the new Nitecore TIP CRI. It uses a high-CRI Nichia 219B emitter. Although there is talk of a 5000K version, for now all that is available is the 3500K version. At 3500K, the TIP CRI can advertise 90+ CRI, but that is not the same thing as 90+ CRI at 5000K.

Unless you’re doing work that specifically requires you to accurately see all colors (such as an electrician), I wouldn’t get too hung up about 5000K temperatures. It’s true that 3500K isn’t going to show you much color that is blue or violet, but it will do a very good job showing you reds and oranges.

Emphasizing reds may be a better thing to do, in certain circumstances. If you’re looking at human skin, for example. Or if you’re walking in the woods. Or if you just enjoy warmer tints because you find them more pleasing and less tiring on the eyes.

There isn’t much blue or violet color in the natural world (except for the sky). So you’re not missing out on much if you use warmer tinted lights. Getting a better separation of colors in the reddish bands may let you see better.

+1

These are pics from a Nitecore TIP and TIP CRI

TIP: low CRI, 6000K… look how green the yellow cards look. I don’t “see” any advantage in the blue from the low CRI, and IMO it makes the silver look too blue.

TIP CRI: look how much more realistic the high CRI 3500K makes the yellow card. Look at the red, gold, and silver too.

What was the camera’s white balance setting when you took these pictures? It is very difficult to make color-accurate photographs under such light sources, i.e. to make the photograph show the same colors you saw with your eyes.

I don’t know; they’re not my pics. I assume they’re from an auto-white-balance cell phone.
You could ask the OP here.
In my limited experience, auto white balance works well enough to show the difference between the two beams, particularly when you have all the colored items to compare in the photo.

Is there any doubt from the pics that the Low CRI is doing a worse job of rendering color?

There is real doubt that these pictures show colors the way they were seen by human eyes. Just a note: setting the “correct” white balance in the camera under irregular light sources such as LEDs is not enough either. One needs to photograph a color reference chart under the same light and then adjust colors in post-processing. When working with light sources that follow the black-body radiation curve more or less closely, like the sun or an incandescent bulb, selecting the correct white balance setting may be sufficient. But with light sources like LEDs, whose spectral energy distribution looks nothing like a black-body curve, a separate measurement at many different points is required to represent colors correctly.
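To make that concrete, here is a minimal numpy sketch (all names and values are illustrative) of the difference between a gray-card white balance, which is just a per-channel gain, and a chart-based correction, which fits a full 3×3 matrix from many patches:

```python
import numpy as np

# Gray-card white balance: one diagonal gain that neutralizes a single
# gray patch. Under a spiky LED spectrum this cannot fix every color.
def gray_card_gains(gray_rgb):
    g = np.asarray(gray_rgb, float)
    return g.mean() / g

# Chart-based correction: least-squares fit of a 3x3 matrix mapping the
# measured patch colors onto their published reference values.
def chart_correction(measured, reference):
    M = np.asarray(measured, float)   # N x 3, patches as photographed
    R = np.asarray(reference, float)  # N x 3, known chart values
    X, *_ = np.linalg.lstsq(M, R, rcond=None)
    return X                          # apply as: corrected = pixels @ X
```

The fitted matrix gets much closer on average, which is why a color reference chart in the frame matters more than the white balance menu setting.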

You did not answer the question, and IMO you are making things way too complicated and creating FUD (fear, uncertainty, and doubt) to discredit a test that clearly shows the difference in color rendering between low CRI and high CRI.

The comparison, IMO, is totally valid. Whatever discrepancy may exist versus what the human eye sees is not relevant; the comparison clearly shows the difference between the two light sources…

Maybe post some of your own photos, instead of criticizing my contribution and calling the example invalid.

If you have a contribution to make to the original topic, please do so. Spare me your opinion about my contribution and just make your own.

Please don’t take this personally. I am not criticizing you or your contribution.

I am just pointing out that the “green” in the cards, which we all see in these pictures and dislike so much, might have looked like perfect “yellow” to human eyes when the picture was taken. Or perhaps some other color. These two pictures were taken under two different light sources, and neither was color calibrated. There is nothing to compare.

I think those pictures likely show what the eye sees when it comes to color fidelity. Low-CRI cool whites do make colors look false and washed out. High-CRI warm whites really do make colors look more vibrant, especially reds, oranges, and yellows.

I’m not sure the green tint in the first picture is an accurate representation, though. In my experience, cameras really tend to over-emphasize the “ugly green” in Cree LEDs (both cool and neutral low-CRI tints). Cameras really do have a hard time getting the tint accurate in these cases, especially the balance between green and magenta.

The point is that CRI 100 at 2000K is very crappy light: a lot of colors are poorly represented. CRI 100 at 5000K, on the other hand, is great light: all colors are represented properly. This subject is complex and poorly understood by many people. A lot of people here seem to think that CRI 100 means good color representation even at warm CCTs. It doesn’t.

My interests are varied, but include photography. Most of the time, therefore, I have a strong preference for 5000K light sources over ones at 3500K. I have some interest in the 5000K Nitecore TIP CRI. I have no interest in the 3500K version.

I am not a collector of flashlights. Since I do not wish to purchase a large number of flashlights, it is easy for me to limit my purchases (of small flashlights) to those with high CRI, CCT between, say, 4500K and 5500K, and neutral tint.

That’s a relatively new policy for me, and it has served as a good brake for my PayPal account. :money_mouth_face: