True Color Rendition (TCR)

See watt I mean? Over-tedious pedantry. :student: :open_mouth:

Btw one doesn’t “need” to do sheeiiitttt. Not harpin’ ya personally. Just that this techno-blab stuff has been gettin’ carried away regarding CRI vs the real world of uses vs personal tint tastes. All where the ultimate aim from a personal perspective is, …….

“Do I see the three primary colors and their range of complementaries in a fashion well enuff that I am pleased with watt I’m illuminating back to my very unique eyes”. :laughing:

How ya “gauge” that from a human uniqueness perspective shouldn’t “need” to be made over-complex.

>Watt

Excellent

You’re discounting how relative human vision is. Shining the same light at the same object(s) can look completely different for me based on time of day/ambient lighting. TM-30 was created to address the many serious downsides of the older “CRI” rating:

Long video, but if anyone’s bored it’s an amazing source of info.

Geez all this has “forced” me to add TCR’s definition to my signature. I’m doomed. :laughing: :open_mouth:

What I’m trying to express is that based on time of day, actual lux level of that sunlight vs the flashlight, etc. you will never be able to judge that based on your own vision alone. That’s why we have objective measurements of color rendition, CCT, etc.

Set a fixed color balance on your camera, take pics at 6am, 12pm, 5pm and 8pm and compare them. The very same light has looked angry blue in some situations, neutral temp but green in others, and VERY warm and rosy at dusk on a partly cloudy day to me.
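
If anyone wants to put rough numbers on that camera test, here’s a minimal Python sketch (assuming Pillow and numpy are installed) that averages a neutral patch in each fixed-white-balance photo and estimates the CCT with McCamy’s approximation. The file names and crop box are just placeholders, and the result is a ballpark figure, not a calibrated measurement.

```python
import numpy as np
from PIL import Image

def srgb_to_linear(c):
    """Undo the sRGB gamma so channel values are proportional to light."""
    c = c / 255.0
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def estimate_cct(path, box):
    """Ballpark CCT from the average color of a neutral patch (gray card).

    box = (left, upper, right, lower) pixel region covering the patch.
    """
    patch = np.asarray(Image.open(path).convert("RGB").crop(box), dtype=float)
    r, g, b = srgb_to_linear(patch).reshape(-1, 3).mean(axis=0)

    # Linear sRGB (D65) -> CIE XYZ
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    x, y = X / (X + Y + Z), Y / (X + Y + Z)

    # McCamy's approximation: chromaticity -> correlated color temperature
    n = (x - 0.3320) / (0.1858 - y)
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

# Placeholder file names for the 6am / 12pm / 5pm / 8pm shots
for shot in ["6am.jpg", "12pm.jpg", "5pm.jpg", "8pm.jpg"]:
    print(shot, round(estimate_cct(shot, box=(100, 100, 200, 200))), "K")
```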

The way we perceive color temp is VERY dependent on lux; this is explained by the Kruithof curve.

Ok.

However I was also going to add another facet to TCR. Our eyes being uniquely different can fine-tune TCR through numbers. By that I mean, for instance, if my wife views the same Picasso in sunlight and then later illuminates it with the given flash, do she and I come up with a pretty close TCR? Then I add my kids. Then my neighbor.

If all of us get a consensus that the particular flash’s TCR is between 8.2 and 9.3, and then average them for a TCR of, let’s say, 8.7, then I could comfortably say that’s a reasonably accurate TCR.
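
Just to show how little machinery that consensus part needs, here’s a toy Python sketch of it (the raters and ratings are made up). It gives the plain average and also snaps it to the 0.5 increments mentioned further down.

```python
# Toy consensus-TCR calculation; the raters and their scores are made up.
ratings = {"me": 8.5, "wife": 9.3, "kid": 8.2, "neighbor": 8.8}

def consensus_tcr(scores, step=0.5):
    """Return the plain average plus a version snapped to `step` increments."""
    avg = sum(scores.values()) / len(scores)
    return avg, round(avg / step) * step

avg, snapped = consensus_tcr(ratings)
low, high = min(ratings.values()), max(ratings.values())
print(f"individual range {low}-{high}, average {avg:.1f}, in 0.5 steps {snapped}")
```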

So if someone should ask me, “Yeah Notta, you’re touting this flash butt watt’s the TCR?”

I could confidently say the TCR for me is 8.5. Consensus average with others was 8.7. That should give someone else a reasonably accurate idea of how the flash performed within those parameters.

If ya add tint to all that, well, IMO because that tends to be a very personal choice, TCR purposely avoids addressing it.

PS. Nothing of course stops you from comparing the LED manufacturer’s CRI for that particular bin, etc., to TCR. I would probably just tell you to be honest that I considered the TCR for me to be 8.5 and leave it at that. Butt I also personally felt that the tint was a bit over-yellowed for my tastes and consequently you might wanna keep that in mind, or not. Ya might prefer yellow tints who knows.

I think most here becuz of their experience know when a flash TCR is ridiculous. A red apple in typical sunlight when flashed looks like a rotten decomposing eggplant.

TCR? “1.0 Mr. Blutarsky.” :laughing:

PSS. I just used decimal points butt let’s get real. Who’s gonna accurately differentiate between a .5 and .7? I could go for .5 increments though to have some flex. Could some flash get a 10? Possible I guess.

PSSS. Ya know I always liked giving 99.99% sure when someone really tries to pin me down on a crucial answer. Then if I’m somehow completely off the carrot truck I can always go back, “Yeah butt I did say only 99.99% sure.” :laughing: :open_mouth: :+1: :beer:

Don’t get me wrong. I understand and appreciate your point, but if you want to know what you’re getting before you have it in hand to test it according to the nottawhackjob TCR scale, you will be dependent on other metrics, and CRI covers only one of the possible factors.

Being the devil’s advocate here - if our eyes can’t tell the difference, why worry about it?

Nevermind my last question. iamlucky13 just answered it

Not saying that our eyes/brain can’t tell the difference, it’s that the subjective experience of it is completely dependent on other variables.

A 3000K light looks terrible compared to sunlight on an overcast or even sunny day at noon, but looks fantastic (and can be objectively accurate) in a gallery.

Any “rating” doesn’t mean much if it changes by walking 10ft into another room IMHO.

I get that we can see differences in the bigger changes (e.g. colour temp) but, as you keep pointing out, with so many variables that can even change perception of colour temp, why bother considering CRI? Walking 10ft past another light source is going to render it useless for perfect perception. It just sounds like chasing ghosts to me.

Taking a step back, I do have to remind myself that other people care about it more than I do, and if a ‘better’ choice is available people will choose it. The problem for me is that ‘better’ seems to be subject to machines and graphs that tell us the differences, since everyone sees something different, not to mention everyone has different preferences.

I can understand lucky’s point that if you know your own preferences then the graphs can help narrow the next choice.

Yes I agree. Butt TCR isn’t shooting for that level of scientifically supportable data. The sole “data” it’s providing is one person’s scalable subjective perception of how true an object’s color rendition becomes when illuminated. Watt I’m also in effect saying here is that the more a person ‘knows’ the object’s color(s), the more accurate TCR will tend to be.

All subjective butt loosely scalable. The person on the other end will have to decide whether to trust the TCR rating, and that will likely depend on the person’s reputation, etc.

CRI and TCR are really two tools enhancing one another.

I didn’t mean that it should replace CRI even partially in the decision process. Someone for instance could trust your quick answer TCR because they know your reputation for being savvy especially when it comes to this area of expertise. Simple answer for simple answer-requesting folks as it were.

I look at a particular LED’s bin, efficiency, Vf, etc., and most importantly watt others are trend-saying about its real-world performance. If they say it makes colors accurately “pop” but shows a slightly green cast on lighter whitish backgrounds I’d probably pass.

The logic of that seems backwards to me. You pick color temp and tint based on preference, but how well that renders colors is shown with the raw numbers rather than what looks “good”. OP seems to be discounting the actual data completely, especially since I’m not aware of a single emitter where dedoming/slicing improves the Ra by a significant amount. If anything it’s probably just a point in the column for lower duv looking “better” to the vast majority of people.

I could be leaning too far into the semantics, but “True Color Rendering” couldn’t be more of a misnomer given that, for example, tint mixes lower color accuracy but look “better” to many people.

Just imagine judging lumen output on perceived brightness only. I wasn’t around for it but I think there have been some heated discussions of people having their own idea of what “a lumen” constitutes here on BLF and/or the other forum.

With reviewers like Maukka showing angular tint shift and changes in duv/temp with drive current… I don’t think there’s much that the data misses TBH.

That’s just an opinion on how likeable the tint is. Sunlight itself has a positive duv which is why there has been discussion of moving away from the BBL on ANSI charts.

A metric means nothing without a standard, and since there are so many variables for what someone “knows” about how colors should be rendered it’s effectively meaningless. What someone thinks of as “sunlight” will be completely different for me in the winter vs a person living on the equator for example.

I don’t have any issue with someone saying “This light’s beam/tint is good/okay/terrible.” but assigning some arbitrary number value to that makes no sense. Making colors “pop” is what we have Rg for, and green tint on white walls is what duv captures.
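
For anyone curious what that duv number actually is: the signed distance of the light’s chromaticity from the blackbody locus in the CIE 1960 (u, v) plane, positive toward green and negative toward rosy/magenta. Here’s a rough Python sketch (assuming numpy and scipy) using Krystek’s polynomial approximation of the locus; the example chromaticity is made up and this is only an illustration, not a reference implementation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def planck_uv(T):
    """Krystek's approximation of the Planckian locus in CIE 1960 u, v."""
    u = (0.860117757 + 1.54118254e-4 * T + 1.28641212e-7 * T**2) / \
        (1 + 8.42420235e-4 * T + 7.08145163e-7 * T**2)
    v = (0.317398726 + 4.22806245e-5 * T + 4.20481691e-8 * T**2) / \
        (1 - 2.89741816e-5 * T + 1.61456053e-7 * T**2)
    return u, v

def xy_to_uv(x, y):
    """CIE 1931 xy -> CIE 1960 uv."""
    d = -2 * x + 12 * y + 3
    return 4 * x / d, 6 * y / d

def duv(x, y):
    """Signed distance from the blackbody locus (+ above/green, - below/rosy)."""
    u, v = xy_to_uv(x, y)
    dist = lambda T: np.hypot(*(np.array(planck_uv(T)) - np.array([u, v])))
    T = minimize_scalar(dist, bounds=(1000, 15000), method="bounded").x
    _, v_bb = planck_uv(T)
    return np.sign(v - v_bb) * dist(T), T

d, cct = duv(0.345, 0.352)            # hypothetical measured chromaticity
print(f"CCT ~ {cct:.0f} K, duv ~ {d:+.4f}")
```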

I think yer over-analyzing.

Nothing can replace watt your brain perceives. I don’t care how many data points and graphs ya have. However a trending subjective observational consensus can also be just as ‘accurate’ as the raw scientific data shows.

Pretty much everyone agrees CRI as an initial starting point is the best place to start - for some. And all I’m saying is I’ve kinda gotten tired of solely relying on it, so why not try analyzing watt I see for myself? Because I know a particular object so well, why not TCR rate it?

PS. I can tell ya haven’t de-domed much. I can emphatically tell ya TCR can change substantially doing so. :laughing:

“A metric means nothing without a standard, and since there are so many variables for what someone “knows” about how colors should be rendered it’s effectively meaningless.”

This is patently absurd. Very simply, if I know exactly how a particular blue brush stroke looks on a painting, then flash it and it now looks greenish, the TCR on that flash is gonna be pretty low.

Remember too it’s not just the LED CRI at play here.

There’s the reflector’s/aspheric lens composition, the lens (AR-coated or not), the driver’s output/battery condition, etc.

All this will determine within a reasonable consensus whether or not that particular flash deserves a higher TCR. Not to mention each particular flash in the run WILL perform differently.

TCR does nothing more than make CRI raw data somewhat more approachable and comparable via real world results.

You don’t. All you see is reflected light, which is composed differently depending on the light source. Then your brain does something like an automatic white balance.
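
To illustrate that “automatic white balance” idea, here’s a tiny Python sketch of the gray-world trick cameras use: scale each channel so its average matches the overall average. It’s only an analogy for what the visual system does, and the warm-cast patch below is made up.

```python
import numpy as np

def gray_world_awb(img):
    """Gray-world white balance: img is an (H, W, 3) array of linear RGB in 0..1."""
    channel_means = img.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means   # boost the deficient channels
    return np.clip(img * gains, 0.0, 1.0)

# A "white" patch under warm light drifts back toward neutral after correction.
warm_white = np.tile([0.9, 0.8, 0.6], (4, 4, 1))   # made-up warm color cast
print(gray_world_awb(warm_white)[0, 0])             # roughly [0.77, 0.77, 0.77]
```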

Lay off the German version of LSD. Watt is that, Jagermeister? :laughing: :open_mouth: :beer:

Yer over-analyzing.

Jägermeister might have an effect on how reflected light is perceived, too. If it shines through the bottle, it’s green.

After consuming too much of it, colors are less important than brightness. You’d want a moon-mode then.

If I understand correctly, raw numbers do not equal eye comfort. OP places a higher priority on visual comfort but BurningPlayd0h places a higher priority on raw numbers… and yet others claim high CRI is most comfortable for them.

I wouldn’t have rolls of Zircon filters and pore over spectral tests (duv is one of those measurements…) if I didn’t think eye comfort was important. Wrapping up all that change in one rating that has little to do with color rendition (which may not change at all with shifts in tint/temp) doesn’t make much sense.

This is why beamshots and measurements exist, so we can have more info than “It looks right now, it’s brighter now” etc. :slight_smile:

I think a thread on visual illusions due to retinal persistence, etc. would actually be really useful info. Still amazes me how much my perception can shift on lights and those are a great demo of why it happens.