Let's Talk Tints

Interesting question… not sure, I'm no expert… what makes you ask?
Red light has less energy per photon than blue light, btw.

Here's my thinking:
if we don't specify the type or color of the energy, it may seem like a light with more lumens has more energy,
but a light with more red, even though it produces fewer lumens, has more Red Light.

Assume you have two otherwise identical lights: a Tool with an XP-G2, and a Tool with a Nichia.

Both would have the same amount of energy going INTO the LED, but not the same amount of total light (lumens) coming out, and not the same amount of Red Light coming out.

The original question of “Tint” preference is linked to the question of CRI preference, which is linked to Brightness preference, which is linked to application preference…

High CRI is usually Warmer “Tint” and less brightness
High Brightness is usually Low CRI and Cooler “Tint”

That is the answer I was looking for, I was trying to understand light, energy and color and their interrelationship. Thank you for a clear answer! :slight_smile:

I think people don’t like NW or warm whites because they’ve never seen a really good one …
The Nichia 219A is the standard for all good tints IMHO
It's not all about CRI. I have enough great tints to change anyone's mind about tints. Being a tint snob makes you look down on tints other people are impressed with. :stuck_out_tongue:

It takes a pretty big pool of lights to figure out what you really like.
Lights with tints you think are pretty nice look like garbage when compared side by side with really good tints.

I am torn between output and pretty tints, so I would say maybe a 5000K tint.

But all tints have their places, WE DON'T JUDGE hahaha… except for me, I hate really blue tints! Why not just use a laser?

I don't see what you mean. The Nichia 219 is ALL about CRI. It just happens to have a 4500K "tint", but the same 4500K in a Zebralight, which does NOT use a Nichia, is not high CRI and is not as nice a "tint".

Maybe show some beamshots of what you consider a good "tint" that is not also High CRI?

Just a slight nit-pick. Incandescent bulbs are not full-spectrum. They are black-body sources of light. Full-spectrum includes the entire spectrum of visible light, plus infrared and near ultraviolet. While incandescent lights are good sources of infrared, they are generally poor sources of blue wavelengths and anything shorter (like violet and ultraviolet).

Lumens are not a measure of the energy of the light. Lumens approximate how bright the light appears to the human eye. Since our eyes are most sensitive to green wavelengths, a light source with a lot of green in it will have more lumens, given the same energy. A blue light source, even if it has a bit more energy, will have fewer lumens. Same for red. And, of course, outside the visible spectrum the light will have zero lumens, even though it may have lots of energy.
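To make the watts-vs-lumens point concrete, here's a minimal sketch (my own illustration, not from the thread): lumens weight radiant watts by the CIE photopic sensitivity curve V(λ), scaled by 683 lm/W at the 555 nm peak. The V values below are approximate standard CIE 1931 photopic table entries.

```python
MAX_LUMINOUS_EFFICACY = 683.0  # lm/W at the 555 nm peak of eye sensitivity

V = {  # approximate CIE photopic sensitivity at a few wavelengths (nm)
    450: 0.038,   # blue
    555: 1.000,   # green (peak)
    650: 0.107,   # red
}

def lumens(radiant_watts, wavelength_nm):
    """Luminous flux of a monochromatic source of the given radiant power."""
    return MAX_LUMINOUS_EFFICACY * V[wavelength_nm] * radiant_watts

# One watt of each color: same energy, very different lumens.
for wl in (450, 555, 650):
    print(wl, "nm:", round(lumens(1.0, wl), 1), "lm")
```

One radiant watt of green comes out to 683 lm, while the same watt of blue is only about 26 lm, which is exactly why "more lumens" does not mean "more energy".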

Sometimes you’ll see light intensity given in terms of milliwatts per square centimeter, or something like that. That’s describing the energy intensity, rather than lux.

Thank you. I welcome the education.

Is it accurate to say that incandescent bulbs are 100 CRI?
Is it accurate to say that incandescent bulbs are Full Visible Spectrum?

or are those also misuses of terminology?

Thank you.

It is interesting that there is a difference between light and what the human eye and brain perceive. Much of the electromagnetic spectrum is not visible to the human eye.

"Color rendering index, or CRI, is a measure of the quality of color light, devised by the International Commission on Illumination (CIE). It generally ranges from zero for a source like a low-pressure sodium vapor lamp, which is monochromatic, to one hundred, for a source like an incandescent light bulb, which emits essentially blackbody radiation. It is related to color temperature, in that the CRI measures for a pair of light sources can only be compared if they have the same color temperature. A standard 'cool white' fluorescent lamp will have a CRI near 62."

"CRI is a quantitatively measurable index, not a subjective one. A reference source, such as blackbody radiation, is defined as having a CRI of 100 (this is why incandescent lamps have that rating, as they are, in effect, blackbody radiators), and the test source with the same color temperature is compared against this."

I found this on a search and cannot verify as to the accuracy.

Yes. However, CRI can be a poor standard on which to base accurate color rendition.

First, CRI is by definition 100, regardless of the CCT (the temperature or tint of the light), if the source matches a blackbody (such as an incandescent light). But a CRI 100 source with a CCT of 1500K is vastly different from a CRI 100 source with a CCT of 10000K. The former is heavily weighted toward red light, while the latter is more saturated in blue. Things are going to look very different under those lighting conditions, and colors will not be accurately identified in either.
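The red-vs-blue weighting above falls straight out of Planck's law. A quick sketch (standard physics, my own numbers, not from the thread) comparing the blue/red balance of a blackbody at 1500K and 10000K:

```python
import math

C2 = 1.4388e-2  # second radiation constant, m*K

def planck(wavelength_m, temp_k):
    """Relative spectral radiance of a blackbody (constant prefactors dropped)."""
    return wavelength_m ** -5 / (math.exp(C2 / (wavelength_m * temp_k)) - 1.0)

def blue_to_red_ratio(temp_k, blue=450e-9, red=650e-9):
    """How much blue light the source emits, relative to red."""
    return planck(blue, temp_k) / planck(red, temp_k)

print(round(blue_to_red_ratio(1500), 4))   # ~0.009: almost no blue at 1500K
print(round(blue_to_red_ratio(10000), 2))  # ~2.18: blue dominates at 10000K
```

Both sources are "CRI 100" by definition, yet the 1500K one emits roughly a hundredth as much blue as red, while the 10000K one emits about twice as much blue as red.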

Second, the CRI standard uses a limited number of test colors to determine the CRI of a light source. The extended set has 14 samples, but the general Ra score averages only the first 8. Usually, that's good enough, but it under-represents deep red (R9).
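A tiny sketch of that averaging, with made-up R values (purely illustrative, not measured): the general CRI (Ra) averages only the first 8 special indices, and deep red (R9) is one of the extra samples that Ra does not count, so a source can score a "high" Ra while rendering deep reds poorly.

```python
def general_cri(r_values):
    """Ra: the average of the first eight special color rendering indices."""
    if len(r_values) < 8:
        raise ValueError("need at least R1..R8")
    return sum(r_values[:8]) / 8.0

# Hypothetical LED: decent on the pastel samples R1-R8, weak on deep red R9.
r = [85, 88, 90, 84, 86, 83, 89, 87,  # R1-R8 -> used for Ra
     20]                              # R9 (deep red) -> ignored by Ra

print("Ra =", general_cri(r))  # a respectable-looking Ra despite R9 of only 20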

I really need both the CRI and the CCT (which is even more important to me), when figuring out what kind of LED lighting I want. Tint (green or magenta) is also very useful. I like tints on the magenta side (like Nichia 219 LEDs), not the green side (like most Cree LEDs).

I think that's sort of misusing the terminology. Incandescent lighting can in principle be full spectrum, but you'd have to get the filament glowing much hotter than the typical ~2500K. 5500K would get you a spectrum similar to the sun, which is full spectrum. I'm not aware of filament materials that can take that kind of temperature, though (even tungsten melts at around 3700K).
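Wien's displacement law makes the 2500K-vs-5500K comparison concrete (a standard-physics sketch, my own numbers, not from the thread): the peak wavelength of a blackbody is λ_peak = b / T.

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temp_k):
    """Peak emission wavelength of a blackbody at the given temperature."""
    return WIEN_B / temp_k * 1e9

print(round(peak_wavelength_nm(2500)))  # ~1159 nm: a typical filament peaks in the infrared
print(round(peak_wavelength_nm(5500)))  # ~527 nm: a sun-like source peaks in visible green
```

So an ordinary incandescent bulb puts most of its output into infrared, with only the short-wavelength tail of its spectrum reaching into blue.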

But generally, incandescent lighting is very poor in blue wavelengths. It’s there, but not very much of it. That’s why it’s hard to distinguish deep blue and black, or blue and violet colors under standard incandescent lighting, unless it’s bright.

There is much to know beyond a CRI number.

It reminds me of a quote by Terence McKenna: "The bigger the bonfire, the more darkness is revealed." :slight_smile:

I have never paid much attention to CRI and neutral/warm tints; I've always been a big fan of CW, but after so many XP-L HI V2 1A mods I admit it starts to get boring. I made a triple Nichia 219C and ordered a 5A BLF D80 some time ago, but it was definitely too warm for me; it looked "dirty".

Decided to try my luck again and got a 3C EagleEye X2R http://www.gearbest.com/led-flashlights/pp_411562.html. I was blown away by the tint!! Everything looked so much more vibrant, and the temperature was simply pleasing to the eyes. I don't even know what bin this 3C tint is, or if it is high CRI of any kind. It looks a lot like my Armytek Wizard XHP50 "CW", which is actually more like 5000K, but somehow the tint of this 3C is prettier, and wherever I shine it, it gives a better sense of depth. Not sure how to explain it…

After playing a bit more with the light, I decided to build my first ever high CRI light. I'm looking for an emitter which is not warmer than this 3C but with higher CRI. It can be XP or XM, as I have both centering rings. Looking for maximum brightness as well, with a FET driver, so no Nichia… Any recommendations?

Thanks!

Cree XHP35 is available with 90+ CRI even at 5000K and 5700K CCT (your 3C is supposed to be between 4750K and 5000K CCT). Note that it is a 12V emitter.

I remember I had one Osram LED (not a Golden Dragon) that had one main white LED and a red diode on the side to compensate the tint.

When it's in moonlight mode, the LED goes really pinkish-warm, but gets really blue when on max (120 lumens).

It makes me wonder if adding red would change the perception of faked CRI? It surely didn't work when I tried an XML2 1A tint with an XR-E red emitter, but the reddish tint is somewhat pleasant.

Which makes me wonder if this is why dive lights need lots of red in their searchlights.

Longer wavelengths (warmer) scatter less light, so are better at cutting through fog, humid air, and water. Perhaps that’s why diving searchlights use warmer tints with more red in the light.
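The scattering difference follows the standard Rayleigh 1/λ⁴ scaling. A minimal sketch (my own numbers, not from the thread) of how much more strongly blue scatters than red:

```python
def rayleigh_ratio(short_nm, long_nm):
    """How many times more strongly the shorter wavelength is Rayleigh-scattered."""
    return (long_nm / short_nm) ** 4

# Blue (450 nm) vs. red (650 nm):
print(round(rayleigh_ratio(450, 650), 1))  # ~4.4x more scattering for blue
```

So in fog or water, blue light from your beam gets scattered back at you several times more strongly than red, which is one plausible reason redder tints seem to "cut through" better.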

+1

I too am rather new to understanding tints, as you can read from my comments on this thread, but like yourself I have grown fond of a 5000K 3C tint. I now know better the reasons why. :slight_smile:

Thanks to jon_slider and WalkIntoTheLight for the outstanding explanations they posted.

  • "Neutral" tints are those found along the black-body radiator line.
  • At a given CCT, above the black-body radiator line you will find "greenish" tints. Below are magenta tints.

Sidecross also shared this gem from FullSpectrumSolutions:

  • "[CRI] is related to color temperature, in that the CRI measures for a pair of light sources can only be compared if they have the same color temperature."
  • Thus, a 100 CRI (Ra) source with a CCT of 2750K will not do as good a job rendering the visible color spectrum as a 100 CRI (Ra) source with a CCT of 5200K.

A good for-instance is the new Nitecore TIP CRI. It uses a high-CRI Nichia 219B emitter. Although there is talk of a 5000K version, for now all that is available is the 3500K version. At 3500K, the TIP CRI can advertise 90+ CRI, but that is not the same thing as 90+ CRI at 5000K.

Unless you’re doing work that specifically requires you to accurately see all colors (such as an electrician), I wouldn’t get too hung up about 5000K temperatures. It’s true that 3500K isn’t going to show you much color that is blue or violet, but it will do a very good job showing you reds and oranges.

Emphasizing reds may be a better thing to do, in certain circumstances. If you’re looking at human skin, for example. Or if you’re walking in the woods. Or if you just enjoy warmer tints because you find them more pleasing and less tiring on the eyes.

There isn’t much blue or violet color in the natural world (except for the sky). So you’re not missing out on much if you use warmer tinted lights. Getting a better separation of colors in the reddish bands may let you see better.

+1

These are pics from a Nitecore Tip and TipCRI

Tip: Low CRI 6000K… look how green the yellow cards look. I don't "see" any advantage in the blue from the Low CRI, and IMO it makes the silver look too blue.

TipCRI: look how much more realistic the High CRI 3500K makes the yellow card. Look at the Red, Gold and Silver too.