Let's Talk Tints

I agree with this for flashlights. In our house we have 3500-3750K in the lights (all LED bulbs and spots), but for flashlights I find 5000-5500K ideal. No blue and no yellow/amber (yellow/amber being as annoying as blue, imho).
But thanks for the choices!

Tint is used rather broadly on flashlight boards. It is actually a technical term, separate from Correlated Color Temperature (CCT).
For example, look at the CCT 5000 Kelvin column below.
Some people call this the beginning of Cool, but it would also be correct to say it is Warmer than 6000k. Confusion is eliminated by referencing the Color Temperature number. Warm, Neutral, and Cool are also valid terms, but I want to stay focused on Tint first.
Look at the 3S Tint “bin” (box). It is a 5000k(+) color with a Yellow tint. Compare that to the 3U box: it is a 5000k(-) color with a more “rosy” tint.
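To make the CCT side of this concrete, here is a minimal Python sketch that estimates CCT from CIE 1931 (x, y) chromaticity using McCamy's well-known approximation. The chromaticity values below are hypothetical stand-ins for what a spectrometer report would list:

```python
def mccamy_cct(x, y):
    """Approximate CCT in Kelvin from CIE 1931 (x, y) chromaticity,
    via McCamy's cubic formula. Reasonable for points near the
    blackbody line, roughly 2900k to 8000k."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Hypothetical measurement near the D50 white point:
print(round(mccamy_cct(0.3457, 0.3585)))  # close to 5000
```

CCT only says where the light falls on the warm-to-cool axis; tint (greenish vs. rosy) is the perpendicular offset from the blackbody line, which is why two 5000k bins like 3S and 3U can look different.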

LEDs do NOT have all the colors of sunlight. A white LED starts out making a lot more blue than red: it is a blue emitter with a phosphor coating that converts part of the blue into longer wavelengths. Making an LED produce more red requires a phosphor mix that emits red, and that conversion costs output: the total amount of light making it out of the reflector drops as more of the blue is converted. You get a light spectrum closer to sunlight, but you give up lumens.

Now for sunlight. On the Color and Tint chart below there is a faint dotted line called the BBL (Black Body Line). This is the IDEAL color and tint for an LED to MATCH a sunlight-like (blackbody) source at that given Color Temperature. Sunlight at noon has a different Color Temperature than at sunset.
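For the curious, the BBL can be approximated numerically. Below is a Python sketch using published cubic fits (valid from roughly 1667K to 25000K) that return the CIE 1931 (x, y) of an ideal blackbody at a given temperature; an emitter plotting above this line looks greenish/yellow for its CCT, below it looks rosy.

```python
def planckian_xy(T):
    """Approximate CIE 1931 (x, y) of a blackbody at temperature T (Kelvin),
    using published cubic fits valid from about 1667K to 25000K."""
    if 1667 <= T <= 4000:
        x = (-0.2661239e9 / T**3 - 0.2343589e6 / T**2
             + 0.8776956e3 / T + 0.179910)
    elif 4000 < T <= 25000:
        x = (-3.0258469e9 / T**3 + 2.1070379e6 / T**2
             + 0.2226347e3 / T + 0.240390)
    else:
        raise ValueError("T outside approximation range")
    if T <= 2222:
        y = -1.1063814 * x**3 - 1.34811020 * x**2 + 2.18555832 * x - 0.20219683
    elif T <= 4000:
        y = -0.9549476 * x**3 - 1.37418593 * x**2 + 2.09137015 * x - 0.16748867
    else:
        y = 3.0817580 * x**3 - 5.87338670 * x**2 + 3.75112997 * x - 0.37001483
    return x, y

x, y = planckian_xy(5000)  # about (0.345, 0.352): the BBL point at 5000k
```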

For any given Color Temperature, most people will prefer the LED Tint that is closest to or BELOW the BBL. That is why we hear people saying they like the Rosy tint of a Nichia (generally this means the LED has a phosphor mix that increases RED output). CRI (color rendering index) goes UP as RED output goes UP. If you start with a Cool White 6000K LED and add phosphor to increase red output, the CCT will drop. For the sake of example, in the case of a 4500K 90+ CRI Nichia 219b, the lumen output drops along with the CCT (Correlated Color Temperature).

So, a 6000K XP-G2 will be Cooler and have more lumens than a 4500K Nichia. The Nichia will have higher CRI and will show reds better. The XP-G2 will be brighter. For the sake of reference, the Lumintop Tool with XP-G2 has a CRI of about 70, a Color Temperature of about 6000 Kelvin, and produces 110 lumens on high. The same Tool with a 4500k Nichia is 90+ CRI and produces 80 lumens. The 30 lumen difference is significant: if you had an 80 lumen light and could get an extra 30 lumens out of it, that is an increase of almost 38%!

So a lot of flashlights are sold based on having the highest brightness. It drives the consumer market. That is how people compare lights, at first. After a while, some flashaholics begin to realize that cool white light does not show the true color of some things very well. They then begin to appreciate High CRI LEDs, for which the name Nichia is often used, because Nichia is like the Kleenex of high CRI LEDs. It is what they specialize in, their market niche. Anytime someone says Nichia, the implication is High CRI of 90+, and along with that it implies a warmer CCT than Cool White (which most people would agree includes 6000k). People also know that Nichia means less bright.

It is a tradeoff: if you want more red, you give up some brightness and some coolness.

Now for what makes some people prefer 6000k over 4500k:

  1. Given the same flashlight, a 6000k LED will be brighter. People prioritize brightness when they first learn the criteria for choosing a flashlight.
  2. 6000k will show less Red. People looking at red things will prefer a different LED, even if they have to settle for fewer lumens.

Someone looking for a person, or their dog, 50 feet away in the backyard, will prefer the 6000k light. They want the most brightness (which also means the light will have the most throw). They want to see far, and brightly. In this scenario, more lumens are more useful than more red rendering.

otoh, someone cooking a steak in their backyard, cutting it open to check if it is done, will prefer the High CRI light, even though it is less bright. At arm's length they do not need maximum brightness, but they do need to see whether the meat looks red or not. This scenario favors the Nichia.

So, which “tint” (actually color temperature) someone prefers depends on how close or far the target is, whether they need maximum output or medium is enough, and whether they need to see large forms or want more emphasis on the ability to show reds.

This is where the ambient light comes into play. During the day, say 12 noon on a sunny day, my brain and eyes are adapted to bright light at high CCT. If I want to use a flashlight to look at things under the hood of the car, I will prefer a CCT in the 6000k range over one in the 4500k range. The 6000k will seem more white, the 4500k will seem more orange. And the 6000k will be brighter, which matters more when my eyes and brain are adapted to ambient sunlight.

So, mechanics, and people who work under cool white light, will prefer 6000k light.

otoh, my house lighting is 3000k incandescent (incandescent light is more full spectrum than LED lighting). After being under 3000k lighting for an hour or more, my brain and eyes adapt to that Color Temperature. The brain basically does what a camera does when it sets its white balance. In this case 3000k becomes the “normal white” for that ambient Color Temperature. Now if I turn on a 6000k LED flashlight, the beam seems very bluish and glaring, whereas a 4000k LED will seem whiter and brighter than ambient, but not as harsh as 6000k.
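The camera analogy can be sketched in a few lines. This is a toy von Kries-style adaptation with made-up channel values (illustrative numbers, not measurements): each channel is scaled so the ambient white reads as neutral, which is roughly what both the visual system and a camera's white balance do.

```python
def white_balance(rgb, ambient_white):
    """Scale an RGB stimulus so that ambient_white maps to neutral (1, 1, 1).
    A crude von Kries-style chromatic adaptation."""
    return tuple(c / w for c, w in zip(rgb, ambient_white))

# Hypothetical channel responses (illustrative only):
warm_ambient = (1.0, 0.8, 0.5)   # red-heavy 3000k-ish room light
cool_beam    = (0.8, 0.9, 1.0)   # blue-heavy 6000k-ish flashlight beam

balanced = white_balance(cool_beam, warm_ambient)
# The blue channel comes out strongly amplified (1.0 / 0.5 = 2.0),
# which is why the 6000k beam looks glaring blue after warm adaptation.
```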

For someone like me, who uses a small AAA flashlight around the house in the evening to look for some wiring behind the TV, or for some shoes at the back of the closet, my 3000k or 4500k flashlights will be preferred over a 6000k light, because the ambient light I am operating in has set my brain's white balance to a warmer color than sunlight at noon.

Since I don't have a dog, do not work as a car mechanic, and do not use a flashlight to hunt for intruders in my back yard, a 6000k light is not my priority. Since I do like to have a small AAA in my pocket that I can use to check if the steak is done when grilling in the dark, or to find my red slippers in the dark corner of my closet, or to do other things at arm's reach where seeing red things correctly matters, I “prefer” a 3000k or 4500k LED. I also want them to be High CRI. Since High CRI is a priority for me, I mostly avoid lights with a 6000k Color Temperature. Usually a 6000k LED will only have about 70 CRI. It will not produce enough red to show red objects as red. They will look more brownish than under full spectrum sunlight, or incandescent light.

Bottom line, the Color Temperature we prefer, changes with the Color Temperature of the background lighting our brain is adjusted to at the time.

Whether or not one prioritizes High CRI, depends on how important the ability to see red is, for a given flashlight application.

Looking at food with a flashlight will favor High CRI choices. Looking at things at close range will favor low brightness and High CRI. Looking for a red tabby cat up a tree in the dark will favor higher brightness and High CRI. Looking for a black labrador running around on a football field in the dark will favor high brightness, and CRI will be a lower priority.

jon_slider

Thank you for the detailed explanation; I found it very useful. :slight_smile:

This is an excellent post.

Great explanation, thanks a lot. One question: why 3000K at your home? Do cooler versions of such high CRI lights exist?

thanks for the kind words everyone

Incandescent bulbs (non-LED) are full spectrum, so 100 CRI
When I went to buy the bulbs, I bought the ones with the highest Kelvin, 3000k
It is definitely warm white, which I find relaxing and pleasant for hanging out.

If I was working, instead of relaxing, then Cool White might be more appropriate, but I don't like Cool White for relaxing.

I had the Philips Hue LED system in my house for a while. It let me play with a whole range of Color Temperatures and tints. I loved them, but they use PWM (a subject for another thread maybe), so I went back to incandescent for the full spectrum and warmth.

At one point I would use the Hue system as an alarm clock, my bedroom overhead light would turn on a deep red at a low intensity. 15 minutes later it would change to a golden yellow, and brighter. 15 minutes later the room would be lit up in Cool White at max intensity, at which point staying in bed was out of the question :slight_smile:

different Colors and brightnesses create different moods and support different behaviors.

You are a sound sleeper. :slight_smile:

not if the LED is not producing much red to begin with
a High CRI LED could be red filtered to show more red (warmer) (this is an N219a, 4500k and 90 CRI)

but a low CRI LED that does not produce much red won't show much when filtered either (this is an XP-G2, 6000k and 70 CRI)

here is another example of the difference in color content of a Nichia vs an XP-G2

see how a red filter would do little to capture red, when so little is there to begin with?:slight_smile:

Jon_slider

Thank you for these materials and explanations; they are very helpful.

One question I do have is concerning light and energy. Are these two terms interchangeable?

interesting question… not sure, I'm no expert… what makes you ask?
red light has less energy per photon than blue light btw

here's my thinking:
if we don't specify the type or color of the energy, it may seem like a light with more lumens has more energy
but a light with more red, even though it produces fewer lumens, has more Red Light

assume you have two identical lights, a Tool with XP-G2, and a Tool with Nichia.

both would have the same amount of energy going INTO the LED, but not the same amount of Total light (lumens) coming out, and not the same amount of Red Light coming out.

The original question of “Tint” preference is linked to the question of CRI preference, which is linked to Brightness preference, which is linked to application preference…

High CRI is usually Warmer “Tint” and less brightness
High Brightness is usually Low CRI and Cooler “Tint”

That is the answer I was looking for. I was trying to understand light, energy, and color and their interrelationship. Thank you for a clear answer! :slight_smile:

I think people don’t like NW or warm whites because they’ve never seen a really good one…
The Nichia 219A is the standard for all good tints IMHO
It’s not all about CRI. I have enough great tints to change anyone’s mind about tints. Being a tint snob makes you look down on tints other people are impressed with. :stuck_out_tongue:

It takes a pretty big pool of lights to figure out what you really like.
Lights with tints you think are pretty nice look like garbage when compared side by side with really good tints.

I am torn between output and pretty tints, so I would say maybe a 5000k tint.

But all tints have their places, WE DON’T JUDGE hahaha… except for me, I hate really blue tints! Why not just use a laser?

I don’t see what you mean. The Nichia 219 is ALL about CRI. It just happens to have a 4500k “tint”, but the same 4500k in a Zebralight, which does NOT use a Nichia, is not high CRI and is not as nice a “tint”.

maybe show some beamshots of what you consider a good “tint” that is not also High CRI?

Just a slight nit-pick. Incandescent bulbs are not full-spectrum. They are black-body sources of light. Full-spectrum includes the entire spectrum of visible light, plus infrared and near ultraviolet. While incandescent lights are good sources of infrared, they are generally poor sources of blue wavelengths and anything shorter (like violet and ultraviolet).

Lumens are not a measure of the energy of the light. Lumens approximate how bright the human eye perceives light to be. Since our eyes are most sensitive to green wavelengths, a light source with a lot of green light in it will have more lumens, given the same energy. A blue light source, even if it has a bit more energy, will have fewer lumens. Same for red. And, of course, outside the visible spectrum the light will have zero lumens, even though it may have lots of energy.

Sometimes you’ll see light intensity given in terms of milliwatts per square centimeter, or something like that. That’s describing the energy intensity, rather than lux.
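This can be illustrated with a back-of-the-envelope Python sketch. It uses a rough single-Gaussian fit to the CIE photopic sensitivity curve V(λ) (a common textbook approximation, not the official table), plus the definition that 1 W of radiant power at the peak of V(λ) is 683 lumens:

```python
import math

def v_photopic(lam_nm):
    """Rough Gaussian approximation of the eye's photopic sensitivity V(lambda).
    Peaks near 555-560 nm; only a crude stand-in for the CIE table."""
    lam_um = lam_nm / 1000.0
    return 1.019 * math.exp(-285.4 * (lam_um - 0.559) ** 2)

def lumens_per_watt(lam_nm):
    """Luminous efficacy of 1 W of monochromatic light at lam_nm."""
    return 683.0 * v_photopic(lam_nm)

print(round(lumens_per_watt(555)))  # green: near the ~683 lm/W maximum
print(round(lumens_per_watt(630)))  # red: a small fraction of that
print(round(lumens_per_watt(450)))  # blue: smaller still
```

Same watt in, very different lumens out, which is exactly why lumens and energy are not interchangeable.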

Thank you. I welcome the education.

Is it accurate to say that Incandescent bulbs are 100 CRI?
Is it accurate to say that Incandescent bulbs are Full Visible Spectrum?

or are those also misuses of terminology?

Thank you.

It is interesting that there is a difference between light and what the human eye and brain perceive. Much of the Electromagnetic Spectrum is not visible to the human eye.

“Color rendering index, or CRI, is a measure of the quality of color light, devised by the International Commission on Illumination (CIE). It generally ranges from zero for a source like a low-pressure sodium vapor lamp, which is monochromatic, to one hundred, for a source like an incandescent light bulb, which emits essentially blackbody radiation. It is related to color temperature, in that the CRI measures for a pair of light sources can only be compared if they have the same color temperature. A standard ‘cool white’ fluorescent lamp will have a CRI near 62.”

“CRI is a quantitatively measurable index, not a subjective one. A reference source, such as blackbody radiation, is defined as having a CRI of 100 (this is why incandescent lamps have that rating, as they are, in effect, blackbody radiators), and the test source with the same color temperature is compared against this.”

I found this on a search and cannot verify as to the accuracy.

Yes. However, CRI can be a poor standard on which to base accurate color rendition.

First, CRI is by definition 100, regardless of the CCT (the temperature of the light), if it matches a blackbody source of light (such as an incandescent light). But a CRI 100 source at a CCT of 1500K is vastly different from a CRI 100 source at a CCT of 10000K. The former is heavily weighted toward red light, while the latter is more saturated in blue light. Things are going to look very different under those lighting conditions, and colors will not be accurately identified in either.

Second, the CRI standard uses a limited number of test colors to determine the CRI of a light source. There are 14 standard test color samples, but the general CRI (Ra) averages only the first 8. Usually, that’s good enough. But it under-represents deep red (the saturated-red R9 sample is not part of the Ra average).

I really need both the CRI and the CCT (which is even more important to me), when figuring out what kind of LED lighting I want. Tint (green or magenta) is also very useful. I like tints on the magenta side (like Nichia 219 LEDs), not the green side (like most Cree LEDs).
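The “First” point above can be checked directly with Planck’s law. This Python sketch compares the relative spectral radiance of a blackbody at a red wavelength (650 nm) vs. a blue one (450 nm) for the two color temperatures mentioned:

```python
import math

C2 = 1.4388e7  # second radiation constant h*c/k, in nm*K

def planck_relative(lam_nm, T):
    """Relative blackbody spectral radiance at wavelength lam_nm (arbitrary units)."""
    return 1.0 / (lam_nm**5 * (math.exp(C2 / (lam_nm * T)) - 1.0))

def red_to_blue(T):
    """Ratio of 650 nm (red) to 450 nm (blue) radiance for a blackbody at T."""
    return planck_relative(650, T) / planck_relative(450, T)

print(red_to_blue(1500))   # on the order of 100: overwhelmingly red
print(red_to_blue(10000))  # below 1: blue dominates
```

Both sources are CRI 100 by definition, yet their red-to-blue balance differs by a couple orders of magnitude.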

I think that’s sort-of misusing the terminology. Incandescent lighting can approach full spectrum, but you’d have to get the filament glowing much hotter than the typical ~2700K. 5500K would get you a spectrum similar to the sun, which is full spectrum. I’m not aware of any filament material that can take that kind of temperature (tungsten, the usual choice, melts around 3700K).

But generally, incandescent lighting is very poor in blue wavelengths. It’s there, but not very much of it. That’s why it’s hard to distinguish deep blue and black, or blue and violet colors under standard incandescent lighting, unless it’s bright.

There is much to know beyond a CRI number.

It reminds me of a quote by Terence McKenna: “The bigger the bonfire, the more darkness is revealed.” :slight_smile: