There are plenty of applications where standard or even low CRI emitters are still acceptable. They're cheaper to produce, and that's a big factor, of course. One example would be very utility-oriented outdoor lighting, where perhaps you just need simple illumination to assist security cameras in situations where infrared alone can't give you what you want. Many people don't use or enjoy flashlights the way we do; maybe they just keep one or a few around the house or office for emergency use, and low CRI is fine there, too.
Whilst that statement is true, it is somewhat offensive to those who don't see or care about the difference. "Denial" also doesn't describe colourblindness in the right manner.
I think this is the point rayfish is trying to make. From the perspective of someone who doesn't care, the "best" CRI is only a subtle difference anyway, so what's the big deal?
Sure, I get that it makes a difference for discerning pairs and a few other job-specific tasks. I also get that other people can see slightly better than me, rayfish, and a few others… but conversely, relying on high CRI rather than full spectrum is just as bad as confusing high and low CCT (which I can see perfectly well).
I don’t mean to be offensive or a troll just saying things how I see it (pun…?? :student: ).
As long as we keep it civil, I still think this is a good discussion point. I'd like more proof with pictures, and I think it would help if more people took their own, instead of sharing some images off the internet.
I dipped my toes into high CRI LEDs about 7-8? years ago, when the only real high CRI option was basically the Nichia 219B in warm white. Yes, it was so warm that some colors were just washed away. Even back then, people were swearing by it..
Right now, High CRI is available in higher CCT LEDs, which makes it a more interesting option.
I wonder what people have to say about rayfish's last comparison.. I think that's a great start for more discussion...
It’s more than a subtle difference. It is a big deal. Seeing is believing; if you can’t see it, you’re not going to believe. You may not think you’re in denial, but your comments in this thread and others indicate to me and others that you are. You want to minimize the difference and chastise others for trying to explain it. I and others appreciate hearing your experience with high CRI versus low CRI, but don’t try to tell the 90% of the population that it’s a subtle difference and not really important when you can’t see what we see.
I'd like to see some comments on post #119.
If that's the case, there is not much difference, and it might be influenced by the CCT?
It actually got me looking into some cheap lights (Convoy) that have the same CCT but high vs. low CRI.. maybe I'll get a few and see the difference myself..
That'd be the best comparison?
If you guys can help me, that would be appreciated:
Maybe this:
- Convoy S2+ Luminus SST20 (4000K)
- Convoy S2+ Nichia 219C (4000K)
- Convoy S2+ Cree XML2 T5 B5 (4000-4200K)
- Convoy S2+ XPL HI U6-4C (4200-4500K)
I’m pretty sure I don’t have any light bulb higher than 80 CRI in my house, and yet I have no problems discerning colors, so I must admit I’m a non-believer. Gonna get me a high CRI LED in my next flashlight, but I must say, just from the pictures I’ve seen, except for outdoor uses, I feel that people put a LOT more emphasis on high CRI than they should. But then again, some folks can’t stand classical music and I enjoy it. Some lean left and some right. Some drink Coke and some prefer Pepsi. Some prefer high CRI, others don’t.
Can you just read the article on Wikipedia?
It might be interesting to take a Sofirn IF25 and replace the pairs of LEDs with 4 of the same CCT, with one pair of the best CRI and the other pair of a lesser CRI.
For me, at this point, I'm just curious how much of a difference they make when the CCT is the same (or very close) in the same flashlight.
I own many flashlights, but they are all different outputs, different reflectors, different CCT, different CRI etc..
Maybe that's going a bit too much off topic though, since you guys are talking about the differences etc.. so I'd like to check this myself with different LEDs in the same flashlight.
So I'll create a new topic for this :D
edit: here it is: https://budgetlightforum.com/t/-/69051
I don’t know if I can see the difference. I know my current favorite LED is high CRI 4000k SST20.
Jon_Slider: I sure can see the difference in your example. Now the unfortunate part, to me, is that depending on the particular tomatoes, the fat content of the meat, and how long it’s been on the stove, either of those two pictures could be the most accurate.
Rayfish’s examples make no sense to me. Perhaps my understanding is off. It would seem to me that if you were trying to demonstrate the difference between LEDs, then you would do a white balance with ambient light and use that balance for all the LEDs. Setting the white balance for each light is letting the camera do the color correction.
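To make that concrete, here's a toy numpy sketch (the gray-card readings are invented, not anyone's actual measurements) of the difference: per-shot auto white balance neutralizes each light's tint, while one shared balance taken from the gray card under ambient light preserves it.

```python
# Toy sketch: why one shared white balance preserves tint differences
# between LEDs, while per-shot auto white balance cancels them out.
# All pixel values below are made-up linear RGB gray-card readings.
import numpy as np

def wb_gains(gray_patch):
    """Gains that map a gray-card patch to neutral (equal R=G=B)."""
    return gray_patch.mean() / gray_patch

# Hypothetical readings of the same gray card under three lights.
shots = {
    "high_cri_4000K": np.array([0.82, 0.80, 0.71]),
    "low_cri_4000K":  np.array([0.78, 0.84, 0.70]),
    "ambient":        np.array([0.79, 0.80, 0.78]),
}

fixed = wb_gains(shots["ambient"])  # one balance, used for every frame

for name, patch in shots.items():
    auto   = wb_gains(patch) * patch  # per-shot WB: always comes out neutral
    shared = fixed * patch            # fixed WB: the LED's tint survives
    print(f"{name:15s} auto-WB={auto.round(3)}  fixed-WB={shared.round(3)}")
```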
I don’t believe anybody is saying that CRI and CRI alone is what’s most important. CCT is a major part of the equation. For many, 3000K to 4000K plus or minus a little, with higher CRI, is the sweet spot for looking at the world and better understanding what you’re looking at. Looking at white walls or tomato paste cans is also only a very small part of it. Think of depth perception when looking past close objects, such as tree trunks near you versus trees farther away, or distinguishing brown and gray leaves from brown and gray rocks and roots on unmaintained trails. You cannot take one piece of the puzzle and see the big picture. And if you are somewhat or a lot color blind, you cannot and will not ever see the big picture.
I had a friend that is color blind.
His father was not color blind.
He was sure that he got his color blindness from his father, which is impossible: red-green color blindness is X-linked, and a son’s X chromosome comes from his mother.
Sadly, he did not understand genetics as well as I do.
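For anyone curious, a toy Python enumeration of the classic single-gene model makes the "impossible" part obvious: sons always get their X from their mother, so the father's copy never reaches them.

```python
# Toy illustration of X-linked (red-green) color blindness inheritance.
# 'x' = X chromosome carrying the colorblind allele, 'X' = unaffected X.
# This is the simplified single-gene model, not full genetics.
from itertools import product

mother = ["X", "x"]  # assume a carrier mother (one affected X)
father = ["X", "Y"]  # the father's own vision is irrelevant to his sons

for m, f in product(mother, father):
    child = m + f
    if "Y" in child:  # sons: the X always comes from the mother
        status = "colorblind" if "x" in child else "normal vision"
        print(f"son {child}: {status}")
```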
I’m not in either camp; my eye likes the in-between ground. People’s CRI preference is entirely up to them, and I would not argue either way. That is not the point of this comment; I’m thinking ‘out of the pan’ lol
What I did find interesting between the two images, though, looking past the pan, is the loss/gain of detail on your worktop. In the bottom image, low CRI brings out the white/light grey flecks in your worktop and shows them very clearly, but in the high CRI version they are almost completely drowned out in comparison; certainly it is a lot less obvious, and a whole lot of detail is lost. You can barely make out the worktop in the top right of the high CRI photo.
So, whilst the high CRI one enhances the red/orange, it significantly impacts the white/grey/black, whilst the low CRI one tones the red right down into a brownish colour but enhances the whites/lighter detail of the worktop.
Blacks and whites (and the greys in between) exist to our eyes just as the colours do, and all have their part to play in what each individual’s eyes perceive as a ‘nice looking’ image and the level of detail across the board.
I would genuinely be interested in seeing an ‘in the middle’ CRI shot of the same setup.
Still don’t get it. Why would anyone take low CRI, which is old and obsolete technology, over superior, modern, and efficient sunlight-replica technology, also known as HIGH CRI? It just doesn’t make any sense to want inferior LEDs.
Oh… I don’t know, maybe there is a thing called tradeoffs? Maybe someone doesn’t want to pay double the price for half the lumens? What’s your user name again?
Comments like this are why it seems like there is a high CRI cult here and on Reddit.
It’s more than a subtle difference. It is a big deal. Seeing is believing; if you can’t see it, you’re not going to believe. You may not think you’re in denial, but your comments in this thread and others indicate to me and others that you are. You want to minimize the difference and chastise others for trying to explain it. I and others appreciate hearing your experience with high CRI versus low CRI, but don’t try to tell the 90% of the population that it’s a subtle difference and not really important when you can’t see what we see.
I’m not trying to minimize the difference. Others who appreciate the difference don’t go WOWZERZ either…
… Seeing, or appreciating that difference, took some time to develop to where I gave it some serious thought.
Like many other flashaholics on these forums, my preferences have gradually evolved to prefer high-CRI.
Alrighty I got my E2A today, and that was the light that calibrated my eyes. Now I can discern the difference better thanks to the more accurate SST-20.
Still not a huge difference, but I’m catching on.
Oh and just because I’m colourblind to a small range of colours doesn’t mean my eyesight is all screwed up. That’d be like saying someone who doesn’t know the letters X and Y can’t read.
…who doesn’t know the letters X and Y can’t read.
There is an X and a Y?
Yeh, they are wonderful letters. They even use them in mathematical equations and three-dimensional positioning (x, y, and z axes), and they even use them in genetics! (XX and XY). Where would the world be without x and y…
Interesting pics, rayfish, thanks for taking the time to do all of that.
I was having a hard time sorting through the collage; I thought for a second that maybe a row was in the wrong location, but then it hit me that we were looking at white balance plus whatever image processing the camera is doing (saturation, for example, which can be turned off with some camera firmwares but not, or not entirely, with others). That makes it hard to compare with what the human eye might see more fully in person. But also, you don’t really have any low CRI emitters to work with, I guess… which I guess we would call > 75 CRI (or 70-ish TLCI).
And yes, color temp makes a huge difference, so the best test would be a bench power supply, the same mounting/reflector setup (a reflector being better for this than optics, maybe), and the same temp from the same manufacturer in different CRI bins. With the “regular” CRI you have there in the 80+ range, the differences aren’t so dramatic, except maybe with skin tones and a few colors… but again, it’s temperature dependent. Slightly warmer temps fill in some of what’s missing at 6000K+ cold temps, and that can effectively make up for CRI as far as many eyes are concerned. When you compare a truly low CRI emitter with a high or regular one, it stands out more (this is very true of LEDs used in office lighting as well, although most have settled on something like 75+ as acceptable… because that’s what the manufacturers of those kinds of bulbs generally want to offer). What’s interesting is the few higher-temp high CRI choices that are popping up now.
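One aside on matching temps: if anyone wants to sanity-check that two emitters really sit at (nearly) the same CCT before crediting CRI for a difference, McCamy's 1992 approximation estimates CCT from a measured CIE 1931 (x, y) chromaticity. A minimal sketch; the example chromaticities are illustrative, not measurements of any emitter mentioned here.

```python
# McCamy's (1992) approximation: CCT from CIE 1931 (x, y) chromaticity.
# Good to within a few kelvin near the blackbody locus, roughly 2850-6500 K.
def mccamy_cct(x: float, y: float) -> float:
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(mccamy_cct(0.3127, 0.3290)))  # D65 white point -> ~6504 K
print(round(mccamy_cct(0.3805, 0.3768)))  # an illustrative ~4000 K point
```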
Saw ChibiM’s….companion….thread and maybe if he or someone follows through there that could be interesting or revealing as well. I thought we’d sort of hashed all this out in old threads here over the years so I still get a little surprised at the banter (let’s call it banter) over this subject. It’s awesome that we have the choices now, take ’em or leave ’em as suits.
On loss of lumens and efficiency… I think that’s a bit overstated. Sure, we see the loss in measurements, and we know exactly why, but many people don’t really perceive that loss by eye. We take a product that is only ~20% efficient to begin with, people are generally happy with its performance, and then we knock off just a tad for better temperatures or higher CRI; mostly people are still happy (or happier, often) with that, and there’s no real appreciable loss to the eye in brightness (assuming the same temps are compared, of course). I think thus far all of the high CRI offerings have been capable of the same current limits as their lower-rendering predecessors, no?
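To put rough numbers on “no real appreciable loss to the eye”: a common rule of thumb (Stevens’ power law with an exponent around 1/3) says perceived brightness scales with roughly the cube root of output. Assuming that approximation holds:

```python
# Rule-of-thumb sketch: perceived brightness ~ output ** (1/3).
# Illustrative arithmetic only, not measurements of any specific LED.
for lumen_loss in (0.10, 0.20, 0.30):
    perceived = 1.0 - (1.0 - lumen_loss) ** (1.0 / 3.0)
    print(f"{lumen_loss:.0%} fewer lumens -> ~{perceived:.0%} dimmer to the eye")
```

By that rule, even a 30% lumen penalty reads as only about 11% dimmer, which fits the observation that most people stay happy with the high CRI version.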
Funny, but this subject (and looking at the manufacturers of other led lighting products as well) just makes me think of the whole “good enough” mindset/ethics of the Chinese manufacturers that we so often quibble about (like the Lumintop FW3A fiasco). It’s like, hey, we know something could be better, but why bother since it’s a bother? I’m glad everyone got past that enough to produce white high power leds rather than sticking with blue or old low power models…someone thought it was worth the bother, and again with pursuing better color rendition. Really we all benefit from that, or could.
Also… about the camera comment. I know we do the best we can, but the limitations of the image sensors, and then the built-in firmware adjustments, just make it hard to end up with a true-to-the-eye image of many things. Can RAW files and the necessary manual processing make up for that? When I’ve toyed with this a little (with a mediocre camera), I really wasn’t able to reproduce images that looked like what I saw in my light beams… but by eye I can notice some significant differences, depending.
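For what it’s worth, one way to take the firmware out of the loop is to develop the RAW files yourself with identical settings for every frame, so any color shift left in the results comes from the light rather than the camera. A minimal sketch, assuming the rawpy library; the filenames and white-balance multipliers are hypothetical.

```python
# Sketch: develop several RAW frames with one fixed set of settings,
# so per-image auto corrections can't mask differences between lights.
import rawpy
import imageio.v3 as iio

FIXED_WB = [2.0, 1.0, 1.6, 1.0]  # hypothetical R,G,B,G multipliers from a gray card

for path in ("high_cri.dng", "low_cri.dng"):  # hypothetical filenames
    with rawpy.imread(path) as raw:
        rgb = raw.postprocess(
            user_wb=FIXED_WB,     # the same white balance for every frame
            no_auto_bright=True,  # disable per-image exposure adjustment
            gamma=(2.222, 4.5),   # one fixed tone curve for all frames
            output_bps=8,
        )
    iio.imwrite(path.replace(".dng", ".png"), rgb)
```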
I’ve noticed this with camera sensors as well. Certain LEDs, such as the XP-L HI 4000K 70 CRI, appear very rosy to my eye, but when I try to photograph them (even with my D850) they look quite green. I assume it has to do with the digital sensor’s sensitivity to certain wavelengths being different from my eye’s.
I find it very difficult to capture the improvement in contrast and depth on an outdoor scene that is illuminated with high CRI when using a camera. The difference is striking in person when switching between two lights, but I just can’t seem to replicate it in an image.
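A toy model of that mechanism, using made-up Gaussian response curves rather than real cone or sensor data: the same spectrum integrates to a different green/red balance under two different sets of sensitivities, so a tint that looks rosy to one “observer” can come out greener to the other before any white balance is even applied.

```python
# Toy model: one spectrum, two observers with different spectral
# sensitivities. All curves below are invented Gaussians, not real
# cone fundamentals or measured sensor data.
import numpy as np

wl = np.arange(400, 701)  # wavelength grid in nm

def band(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical LED spectrum: blue pump + phosphor hump + red shoulder.
spd = 0.6 * band(450, 12) + 1.0 * band(560, 55) + 0.5 * band(620, 25)

# Invented green/red sensitivities for an "eye" and a "sensor".
observers = {
    "eye":    (band(545, 45), band(605, 40)),
    "sensor": (band(535, 55), band(625, 30)),
}

for name, (green, red) in observers.items():
    ratio = np.trapz(spd * green, wl) / np.trapz(spd * red, wl)
    print(f"{name}: green/red response ratio = {ratio:.2f}")
```

With these invented curves, the “sensor” weighs the green region noticeably more heavily than the “eye” does, which is exactly the kind of mismatch that could turn a rosy beam green in a photo.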