But seriously, seeing this and the review of the Bandicoot reminded me that arguing over CRI is a sign of how far most flashlights have come. I’ve only been here since 2013, but I’ve NEVER seen a serious argument that visible PWM is good. Anybody who thinks visible PWM is good is obviously part of a brainwashed cult.
Of course, as has happened many times, I was wrong: my color preferences have changed and I now prefer 4000K to 6000K. Nothing wrong with my friends liking 6000K. It might even be genuinely better at certain tasks. I just prefer warmer.
The perception of light by the human eye is a subjective thing. Different people have different visual acuity and color perception, the latter being at the forefront of this discussion.
Now, if you understand that low CRI emitters are made to boast higher lumen figures at the price of lower color accuracy, that's fine. Let's say that low CRI emitters with higher lumens are not better, just different. That's as charitable as I can be here, because in practice true flashlight lovers end up discovering that their EDC lights are high CRI, while their low CRI units tend to end up being shelf queens. This usually doesn't happen overnight, but over time.
If you don't really care about colour accuracy, get some green emitters and see if you like them (examples: Osram Ostar Projection PC-green, KP CSLNM1 F1); they can provide you maximum green lumens per watt.
:facepalm:
The loss in lumens is very bad and the flashlight gets hot very fast bla bla bla? Are you trying to imply that the flashlight gets hotter with a high CRI emitter versus a low CRI one?
There's only one case in which flashlight heating would be lower, and that is a low-Vf emitter versus a high-Vf emitter in a flashlight using a switching driver (buck, boost, or buck-boost), and even then the difference would be very, very small. It has to do with the Vf of the emitter, not the CRI: regulated drivers regulate current flow, and current times voltage (Vf) equals power at the emitter, so when the emitter Vf is higher, the power at the emitter and the overall dissipated power are also higher. So only with a switching driver, and more than likely with an accurate thermometer to measure a very tiny difference, could you say there's a difference in heating between low-Vf and high-Vf emitters.
And with a more typical linear regulated driver (AMC7135s and regulated-MOSFET units), the difference is zero for as long as the battery voltage stays above the emitter Vf, which keeps the driver in regulation; only when the battery voltage drops below the emitter Vf does the current begin to fall, and with it the flashlight heating. Only with unregulated drivers could you say heating is higher with a lower-Vf emitter, as the larger battery-to-emitter voltage difference causes higher current flow and power output.
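A minimal sketch of the arithmetic above, assuming a fixed regulated current and made-up Vf figures (not measurements of any real emitter): with a switching driver, total heat tracks emitter power P = I × Vf, while a linear driver in regulation pulls I × Vbat from the battery regardless of which emitter is installed, burning the excess voltage in the driver itself.

```python
# Hypothetical numbers only: a regulated driver holding 3 A,
# with a "low Vf" and a "high Vf" emitter for comparison.
DRIVE_CURRENT_A = 3.0

def emitter_power(vf_volts, current_a=DRIVE_CURRENT_A):
    """Power dissipated at the emitter: P = I * Vf.
    With an (ideal) switching driver this is also the total heat."""
    return current_a * vf_volts

def battery_power_linear(vbat_volts, current_a=DRIVE_CURRENT_A):
    """With a linear driver in regulation, the battery delivers I * Vbat
    no matter the emitter Vf; the driver burns off the (Vbat - Vf) excess."""
    return current_a * vbat_volts

low_vf, high_vf = 2.9, 3.3   # hypothetical Vf values at 3 A
vbat = 4.0                    # battery voltage, comfortably above Vf

# Switching driver: the high-Vf emitter dissipates slightly more.
print(f"{emitter_power(low_vf):.1f} W")   # 8.7 W
print(f"{emitter_power(high_vf):.1f} W")  # 9.9 W

# Linear driver in regulation: total heat is identical either way;
# only the split between driver and emitter changes.
print(f"{battery_power_linear(vbat):.1f} W")  # 12.0 W
```

Note the gap is about a watt in this toy case, and real switching drivers add conversion losses of their own, which is why the poster's "you'd need an accurate thermometer" caveat holds.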
For what it’s worth, in this whole thread, I’ve only read your first post, and the 119th post you suggested.
Post 119 is just a double-down of your original post.
To paraphrase it: “This is why I like my steaks well WELL done”
Your reasoning: “I can just douse it in A1 sauce”
You are mainly talking about pictures, and even then it’s a pretty dismal example.
For starters, you’re using only what are considered good CRI emitters; 80 CRI is still the beginning of high CRI. I don’t know why you insist that low CRI emitters wouldn’t make a difference without actually using any in your example, but I digress.
You being able to play with camera settings to get different good CRI emitters to look similar in photo quality on basic colored objects is not really driving your point home like you think it is. The point of taking high CRI photos is so that you can make colors pop even more than normal in radiant ways; I’m sure I don’t have to explain why this is useful. So applying edits to a high CRI photo and a poor CRI photo would absolutely render vastly different results. A high CRI mule photo, for instance, will have rich, color-detailed photo data to start editing from, while the low CRI photo will basically require repair just to look decent and well-colored. That is, by all accounts, a pretty large difference in photo quality.
But forget photos. How are you gonna edit real life in real time the whole time your flashlight is on? You’re essentially arguing that 4K HDR TVs are pointless. We should all have stale blue screens because that will save more energy. Now while there could be some genius to that suggestion, we are humans and we need to experience things in real ways. Two of those things we crave to experience in real ways are colors and light. We don’t live our lives to be solely efficient (imagine?), we live to enjoy life as fully as we can. That’s why we have badass inefficient V8s, turbos, NOS, ultra-rich 4K TVs, retina displays and even sunglasses. Efficiency simply isn’t the most important thing in life, and it certainly isn’t in flashlights either. A high CRI flashlight will, simply put, make everyday life feel like 4K HDR (especially the right high CRI emitters). So, I have no clue why you’d insist on saying that seeing everything in its most vibrant and colorful self is just an… “inefficient waste of energy”. But to each their own, I guess.
Riiiiiiight. So the plan is to completely ignore everything I said, and your big winning point is “my brain works better on low CRI”? Well I guess we can’t argue with science you literally made up out of thin air can we? So… :person_facepalming:
Also, your grand proclamation based on that is that “High CRI is not the pinnacle of flashlight marketing”? Now we’re discussing strictly marketing, not the real-world benefits of each emitter? I mean, I knew responding would be a waste, but I guess at least it served to prove I was right about not bothering to respond before this, so… :+1:
A lot of the gaps are filled in by the other posts in this thread. People want to make sense of the world around them based on what information they have and what they see, or in this case can’t see. If you read the whole thing, at the end you will receive a name tag and you will have learned something.
If anyone puts two lights side by side, it's usually very easy to judge which one you like best.
The terms "CRI baby" and "tint snob" are just offensive for no reason. Everyone is different, and no one really cares what tint or hue you like, or how much you like red or dislike green. People prefer one tint, emitter, or CRI over another; who cares? Calling someone a snob is just stupid.
I look at High CRI like this .
There is looking
and there is seeing .
Seeing lets you avoid the corner of the coffee table and the LEGO on the floor as you head off to the bathroom at midnight .
Flashlight success.
Looking is like inspecting art or admiring flowers, where you want to see every single aspect perfectly.
Our brains are doing a lot of the heavy lifting when CRI is poor.
High CRI to me is like night and day. Why anyone would want washed-out, phantom, freaky grey or nasty green in monochrome is beyond me.
As much as the CRI gang is trying to dunk on the OP in this thread, I have to inform you folks that many of your posts are even more cringe.
I bought one of those FA3 SST20 95CRI FW3A’s that many people (one a few posts up) swear by as being one of the cleanest neutral high CRI emitters, and I have to say that’s not remotely true. Even at high power levels there’s an obvious yellow cast, not too dissimilar to the standard pink glow of a 4500K 219B.
I appreciate objective measurements done by contributors like maukka, but I don’t take most people’s takes on CRI too seriously. Most people either don’t seem to know what they’re talking about, don’t know the limitations of CRI, get too religious about it, or seem like they’re just parroting what they read from some torch wiki.
Working eight hours under low cri lighting gives me a headache. Under high cri I smile and get my work done. There’s probably all kinds of studies done about that but alas, I’m too lazy to find them. Must be that mercury vapor lamp at the barn making me lethargic. When it comes to flashlights I get the highest cri I can, and double the lumens over what I think I need. Pretty scientific huh? It really pains me to see all those false colored shopping carts in the parking lot of a mega store. Three cheers for low brow lot lighting.
The cri discussion continues. Good. Keeps the lighting companies on their toes.
Chewing on this a bit and I think maybe there are some fundamentals missing… a fundamental understanding of the CRI test and its purpose, and maybe the history as well. It’s a useful tool, but it was created when there was little else but incandescent/tungsten and fluorescent lighting was just coming on the scene. It was never perfect, and that became much more apparent as lighting tech developed and then legit LEDs rocked the whole boat. The television/video standard rayfish mentioned is pretty specialized, but several others have been introduced in the last 20-ish years. One that I had not heard of until a few years ago is TM-30, which came from the IES (the lighting engineers’ society) and is much more complex and revealing. CRI didn’t include the R9 saturated red or any of the other saturated colors, and there wasn’t much above 5000K to compare with the standard sources… so they mathed it out, and that wasn’t perfect either.
The biggest thing here is that testing CRI of different color temperatures cannot reveal much and correcting for CCT in a camera image sensor/firmware doesn’t cut it either. Mixing manufacturers doesn’t help either, nor different hosts/optics, but there’s room there for our purposes. We don’t know which emitters these were or their bins/tint, either, which is huge. None of this is easy to test at home even if someone wants to buy/mod a few of the same hosts with emitters that minimize the error/help to standardize the testing. I think it’s also very important that no low-cri emitters are being used here and the term “low” is being misapplied as it’s compared to “high” products. Trying to “separate” CRI is not easy, maybe can’t be done effectively this way at all, but there’s so much more to the whole picture - in emitters as well as our finished flashlight products.
It’s fun to learn and fun to experiment and everyone can use whatever they like. Where there is real value in higher CRI to people, hey great, and where others don’t value it the same, hey that’s great too, just enjoy the hobby or the utility. I think I said it before, but it’s awesome that we even have these choices at all today in the relatively small-dollar niche market of flashlights. The industries who really care about this are who brought it to us and thankfully it trickled down into emitters that we can use. We got more efficiency, more lumens, a range of colors and white temperatures, and even CRI improvements for some. Freakin’ awesome.
Why not? A camera fails to show the difference clearly, so it takes a different instrument, e.g. a spectrometer, giving R values and showing a more complete spectrum for HCRI LEDs. Isn’t a higher Ra already evidence for better color resolution? I think it is, no?
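Worth spelling out what that Ra number actually is. A minimal sketch, with entirely hypothetical per-sample scores (not measurements of any real LED): each special index is Ri = 100 − 4.6·ΔEi for test color sample i, and the general index Ra averages only R1 through R8, which is exactly why an emitter can score a respectable Ra while butchering saturated red (R9).

```python
# Minimal sketch of how general CRI (Ra) is assembled from the
# special indices Ri. The Ri values below are hypothetical.

def general_cri(r_values):
    """Ra is the arithmetic mean of the first eight special indices
    (R1..R8, the pastel test samples); R9+ are excluded."""
    first_eight = r_values[:8]
    return sum(first_eight) / len(first_eight)

# Hypothetical scores for an LED with weak saturated red:
r = [92, 90, 88, 91, 89, 87, 90, 85,  # R1..R8
     35]                               # R9 (saturated red), ignored by Ra

print(round(general_cri(r)))  # 89, despite R9 = 35
```

So yes, a higher Ra is evidence of better average fidelity across those eight samples, but the spectrometer's full R-value readout (especially R9) tells you things the single Ra figure hides.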
There is no debate here as far as I can tell. It is simple and doesn’t require too much technical info. Here it is for those that still don’t get it.
High CRI LEDs are like high-definition 4K HDR displays.
Low CRI LEDs are like old square-tube televisions.
One is beautiful, one is not. End of class.
If you prefer nasty images and don’t care about quality, then just say that. I agree that any light in a pinch is always welcome. But acting like there is no visual quality and color difference between low and high CRI is a delusion through and through.