Let's Talk Tints

jon_slider

Thank you for the detailed explanation; I found it very useful. :slight_smile:

This is an excellent post.

Great explanation, thanks a lot. One question: why 3000K at your home? Do cooler versions of such high CRI lights exist?

Thanks for the kind words, everyone.

Incandescent bulbs (non-LED) are full spectrum, so 100 CRI.
When I went to buy the bulbs, I bought the ones with the highest Kelvin, 3000K.
It is definitely warm white, which I find relaxing and pleasant for hanging out.

If I were working, instead of relaxing, then Cool White might be more appropriate, but I don't like Cool White for relaxing.

I had the Philips Hue LED system in my house for a while. It let me play with a whole range of color temperatures and tints. I loved them, but they use PWM (a subject for another thread, maybe), so I went back to incandescent for the full spectrum and warmth.

At one point I would use the Hue system as an alarm clock: my bedroom overhead light would turn on in deep red at low intensity. 15 minutes later it would change to a golden yellow, and get brighter. 15 minutes later the room would be lit up in Cool White at max intensity, at which point staying in bed was out of the question :slight_smile:

Different colors and brightnesses create different moods and support different behaviors.

You are a sound sleeper. :slight_smile:

Not if the LED is not producing much red to begin with.
A high CRI LED could be red filtered to show more red (warmer). (This is an N219a, 4500K and 90 CRI.)

But a low CRI LED that does not produce much red won't show much when filtered either. (This is an XP-G2, 6000K and 70 CRI.)

Here is another example of the difference in color content of a Nichia vs an XP-G2.

See how a red filter would do little to capture red, when so little is there to begin with? :slight_smile:

Jon_slider

Thank you for these materials and explanations; they are very helpful.

One question I do have is concerning light and energy. Are these two terms interchangeable?

Interesting question… not sure, I'm no expert… what makes you ask?
Red light has less energy than blue light, btw.

Here's my thinking:
If we don't specify the type or color of the energy, it may seem like a light with more lumens has more energy.
But a light with more red, even though it produces fewer lumens, has more Red Light.

Assume you have two identical lights, a Tool with an XP-G2 and a Tool with a Nichia.

Both would have the same amount of energy going INTO the LED, but not the same amount of total light (lumens) coming out, and not the same amount of Red Light coming out.

The original question of “Tint” preference is linked to the question of CRI preference, which is linked to Brightness preference, which is linked to application preference…

High CRI is usually Warmer “Tint” and less brightness
High Brightness is usually Low CRI and Cooler “Tint”

That is the answer I was looking for; I was trying to understand light, energy, and color and their interrelationship. Thank you for a clear answer! :slight_smile:

I think people don’t like NW or warm whites because they’ve never seen a really good one …
The Nichia 219A is the standard for all good tints IMHO
It's not all about CRI. I have enough great tints to change anyone's mind about tints. Being a tint snob makes you look down on tints other people are impressed with. :stuck_out_tongue:

It takes a pretty big pool of lights to figure out what you really like.
Lights with tints you think are pretty nice look like garbage when compared side by side with really good tints.

I am torn between output and pretty tints, so I would say maybe a 5000K tint.

But all tints have their places, WE DON'T JUDGE hahaha… except for me, I hate really blue tints! Why not just use a laser?

I don't see what you mean. The Nichia 219 is ALL about CRI. It just happens to have a 4500K "tint", but the same 4500K in a Zebralight, which does NOT use a Nichia, is not high CRI and is not as nice a "tint".

Maybe show some beamshots of what you consider a good "tint" that is not also High CRI?

Just a slight nit-pick. Incandescent bulbs are not full-spectrum. They are black-body sources of light. Full-spectrum includes the entire spectrum of visible light, plus infrared and near ultraviolet. While incandescent lights are good sources of infrared, they are generally poor sources of blue wavelengths and anything shorter (like violet and ultraviolet).

Lumens are not a measure of the energy of the light. Lumens are an approximation of how bright the human eye perceives light to be. Since our eyes are most sensitive to green wavelengths, a light source with a lot of green light in it will have more lumens, given the same energy. A blue light source, even if it has a bit more energy, will have fewer lumens. Same for red. And, of course, outside the visible spectrum the light will have zero lumens, even though it may have lots of energy.
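To put rough numbers on that, here is a small sketch (assuming three idealized monochromatic sources of equal radiant power, with approximate CIE 1931 photopic sensitivity values; Python is used only for illustration):

```python
# Sketch: equal radiant power (watts) does not mean equal lumens.
# Luminous flux = 683 lm/W * V(lambda) * radiant power, where V(lambda)
# is the eye's photopic sensitivity (approximate CIE 1931 values below).

V_LAMBDA = {
    450: 0.038,  # blue
    555: 1.000,  # green, peak of eye sensitivity
    650: 0.107,  # red
}

radiant_power_w = 1.0  # the same 1 W of optical power for each source

for wavelength_nm, v in V_LAMBDA.items():
    lumens = 683.0 * v * radiant_power_w
    print(f"{wavelength_nm} nm: {lumens:.0f} lm from {radiant_power_w:.0f} W")

# Approximate output: 450 nm -> 26 lm, 555 nm -> 683 lm, 650 nm -> 73 lm
```

So a watt of deep red or blue light simply cannot score as many lumens as a watt of green, no matter how efficient the emitter is.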

Sometimes you’ll see light intensity given in terms of milliwatts per square centimeter, or something like that. That’s describing the energy intensity, rather than lux.
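The unit conversion itself looks like this, as a sketch (assuming a monochromatic source so a single V(λ) value applies; 1 mW/cm² equals 10 W/m²):

```python
# Sketch: converting an energy-based intensity (mW/cm^2) into lux.
# Illuminance (lux) = 683 lm/W * V(lambda) * irradiance (W/m^2).

def irradiance_to_lux(mw_per_cm2: float, v_lambda: float) -> float:
    w_per_m2 = mw_per_cm2 * 10.0  # 1 mW/cm^2 = 10 W/m^2
    return 683.0 * v_lambda * w_per_m2

print(irradiance_to_lux(1.0, 1.000))  # 1 mW/cm^2 of 555 nm green: ~6830 lux
print(irradiance_to_lux(1.0, 0.107))  # 1 mW/cm^2 of 650 nm red:   ~730 lux
```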

Thank you. I welcome the education.

Is it accurate to say that Incandescent bulbs are 100 CRI?
Is it accurate to say that Incandescent bulbs are Full Visible Spectrum?

or are those also misuses of terminology?

Thank you.

It is interesting that there is a difference between light and what the human eye and brain perceive. Much of the electromagnetic spectrum is not visible to the human eye.

"Color rendering index, or CRI, is a measure of the quality of color light, devised by the International Commission on Illumination (CIE). It generally ranges from zero for a source like a low-pressure sodium vapor lamp, which is monochromatic, to one hundred, for a source like an incandescent light bulb, which emits essentially blackbody radiation. It is related to color temperature, in that the CRI measures for a pair of light sources can only be compared if they have the same color temperature. A standard 'cool white' fluorescent lamp will have a CRI near 62."

"CRI is a quantitatively measurable index, not a subjective one. A reference source, such as blackbody radiation, is defined as having a CRI of 100 (this is why incandescent lamps have that rating, as they are, in effect, blackbody radiators), and the test source with the same color temperature is compared against this."

I found this on a search and cannot verify its accuracy.

Yes. However, CRI can be a poor standard to base accurate color rendition on.

First, CRI is by definition 100, regardless of the CCT (temperature or tint of the light), if it matches a blackbody source of light (such as an incandescent light). But a CRI 100 with a CCT of 1500K is vastly different than a CRI 100 with a CCT of 10000K. The former is heavily weighted in red light, while the latter is more saturated in blue light. Things are going to look very different under those lighting conditions, and colors will not be accurately identified in either.
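To see how differently those two references weight red versus blue, one can evaluate Planck's law at a red and a blue wavelength. This is just an illustrative sketch, with 650 nm and 450 nm chosen as the comparison points:

```python
# Sketch: red vs. blue content of blackbody radiation at two CCTs,
# using Planck's law for spectral radiance.
import math

H = 6.626e-34  # Planck constant, J*s
C = 2.998e8    # speed of light, m/s
K = 1.381e-23  # Boltzmann constant, J/K

def planck(wavelength_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance (arbitrary units, used only for a ratio)."""
    return (2.0 * H * C**2 / wavelength_m**5) / math.expm1(
        H * C / (wavelength_m * K * temp_k)
    )

RED_NM, BLUE_NM = 650e-9, 450e-9
for temp_k in (1500, 10000):
    ratio = planck(RED_NM, temp_k) / planck(BLUE_NM, temp_k)
    print(f"{temp_k} K: red/blue radiance ratio ~ {ratio:.2f}")

# ~1500 K: ratio on the order of 100 (almost all red, hardly any blue)
# ~10000 K: ratio below 1 (more blue than red)
```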

Second, the CRI standard uses a limited number of test colors to determine the CRI of a light source. I think it's around 14, IIRC, though the general Ra figure averages only the first 8 of them. Usually that's good enough. But it under-represents deep red as a color, since the saturated red sample (R9) is not part of that average.

I really need both the CRI and the CCT (which is even more important to me) when figuring out what kind of LED lighting I want. Tint (green or magenta) is also very useful. I like tints on the magenta side (like Nichia 219 LEDs), not the green side (like most Cree LEDs).

I think that's sort of misusing the terminology. Incandescent lighting can be full spectrum, but you'd have to get the filament glowing much hotter than the typical ~2500K. 5500K would get you a spectrum similar to the sun, which is full spectrum. I'm not aware of any filament material that can take that kind of temperature, though; tungsten already melts at around 3700K.
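Wien's displacement law puts numbers on that filament-temperature point. A small sketch (the temperatures are just illustrative):

```python
# Sketch: Wien's displacement law, peak wavelength = b / T.
WIEN_B_NM_K = 2.898e6  # Wien's displacement constant in nm*K

for temp_k in (2500, 2700, 5500):
    peak_nm = WIEN_B_NM_K / temp_k
    print(f"{temp_k} K blackbody peaks at ~{peak_nm:.0f} nm")

# ~2500-2700 K (typical filament): peak near 1070-1160 nm, out in the infrared
# ~5500 K (sun-like): peak near 527 nm, in the middle of the visible range
```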

But generally, incandescent lighting is very poor in blue wavelengths. It’s there, but not very much of it. That’s why it’s hard to distinguish deep blue and black, or blue and violet colors under standard incandescent lighting, unless it’s bright.

There is much to know beyond a CRI number.

It reminds me of a quote by Denis McKenna: 'The bigger the bonfire, the more darkness is revealed.' :slight_smile:

I have never paid much attention to CRI and neutral/warm tints; I've always been a big fan of CW, but after so many XP-L HI V2 1A mods I admit it starts to get boring. I made a triple Nichia 219C and ordered a 5A BLF D80 some time ago, but it was definitely too warm for me; it looked "dirty".

Decided to try my luck again and got a 3C EagleEye X2R http://www.gearbest.com/led-flashlights/pp_411562.html. I was blown away by the tint!! Everything looked so much more vibrant, and the temperature was simply pleasing to the eyes. I don't even know what bin this 3C tint is or if it is high CRI of any kind. It looks a lot like my Armytek Wizard XHP50 "CW", which is actually more like 5000K, but somehow the tint of this 3C is prettier, and wherever I shine it, it gives a better sense of depth. Not sure how to explain it…

After playing a bit more with the light, I decided to build my first ever high CRI light. I'm looking for an emitter that is not warmer than this 3C but with higher CRI. It can be XP or XM, as I have both centering rings. Looking for maximum brightness as well, with a FET driver, so no Nichia… Any recommendations?

Thanks!

Cree XHP35 is available with 90+ CRI even at 5000K and 5700K CCT (your 3C is supposed to be between 4750K and 5000K CCT). Note that it is a 12V emitter.