There are some TVs and monitors that are 8K, but perhaps 8K is overkill (and they’re probably overpriced as well).
My vision is pretty good (if I wear my glasses), but I honestly cannot notice any improvement from 30fps to anything higher.
I hear that, and I kinda agree that 4K is plenty. I don’t know why 8K TVs are a thing when 8K content doesn’t exist on the market. Plus, my TV (even my old one) looks better than the theater I go to; the theater looks washed out to me now.
Also, the color grading on Gladiator 2 was a crime lmao. Why is everything so grey??? It was like watching a video game from 2002
I posted the wrong video link and just now corrected it. A very good video that went over all the settings and helped me very much at the beginning. The guy doing the video, Classy Tech, is considered one of the leading calibrators here in the US.
His recommended settings are followed by a lot of people on AVS Forum and are posted on page 1 of that thread. AVS Forum is home to a lot of nutty videophiles, as crazy about video as we are here about flashlights. And THAT… is pretty crazy.
I had a brain fart and meant to say Un-Crush the blacks instead of unblock the blacks. Which, CE, I know you understand. For others who may not be familiar, here’s my version of what’s going on. A simple explanation from a simple mind {me}.
Glad you went for the 65, easy to get used to the bigger size. In a month you’ll look at the old set and think “we used to watch that tiny thing?”
Crushed blacks happen when you can’t see the difference between full black and the next few lighter values.
For example, a character comes out in a dark set wearing a black jacket or suit. Any detail in the cloth is just not there. Hence the term crushed blacks: the low values are crushed together until they’re not discernible.
This is often seen when watching in viewing rooms that are not really (really) dark, ambient-light-wise. Like most un-dedicated HT rooms, such as my living room.
A guy comes out in a suit and it’s all black until all the viewing-room lights are off. Then, if the settings are right, you can see the detail in the fabric patterns.
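A toy sketch of what crushing does to the low code values (assuming 8-bit video levels; the threshold here is made up purely for illustration):

```python
# Simulate crushed blacks: every code value below some threshold gets
# rendered as full black, so nearby dark shades become indistinguishable.
def render(value, crush_below=0):
    """Map an 8-bit source value (0-255) to a displayed value."""
    return 0 if value < crush_below else value

jacket_weave = [16, 18, 20, 22]  # subtle dark-fabric detail

# Well-adjusted display: each shade stays distinct.
print([render(v) for v in jacket_weave])                  # [16, 18, 20, 22]

# Brightness set too low (everything under level 21 is crushed):
print([render(v, crush_below=21) for v in jacket_weave])  # [0, 0, 0, 22]
```

The fabric pattern lives entirely in those small differences between adjacent dark values, which is why it vanishes the moment they all map to the same black.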
I hate this, and in years past would put up with less-than-full blacks to prevent it.
Many true-black lovers live/lived with crushed blacks when watching in rooms with even moderate lighting. All for the need to see true-black bars framing the image.
I hold YouTube reviewers responsible for this: blacks at all costs.
With the B9 I can get this in a brighter room.
All the Best, Jeff
I guess I was a little obtuse, but I thought I’d ask just in case there was a specific (and interesting) problem with blacks on the B9.
I still need to investigate black levels on the A95L. I’m a little unsure of how far to go in reining in blacks since, realistically, some things are probably supposed to be obscured by shadow.
There is a scene in Pacific Rim near the beginning that takes place on a landing pad. It’s overcast and someone is wearing a black outfit. On my last TV they were more or less just an outline. I’m hoping that the new TV will be able to do this scene more justice.
4K is really nice. The B9 is my first 4K that isn’t a monitor.
I don’t see a big difference in some shows. Maybe it depends on the source?
Was watching a 4K show interviewing some guy with long white hair.
Each loose hair was so clear against the dark background.
I was just amazed how much better it looked than a 1080 version.
TV size and viewing distance are part of the equation.
All the Best, Jeff
The LG B9 4K looks amazing. 55" is $1600… Has there been a general pricing trend downward on this model?
I am no expert on video technicals (but decent at “judging,” after many years wasting time with TVs and photography), but if you’re really picky then maybe a pro calibrator would help. My JVC projector was calibrated, and although I still use the more “exciting” modes for fun, the calibrated picture is absolutely spot-on in the same visual respects we’ve spent hours talking about here on BLF. Skin tone especially is just spot on: warm, and not overly blue like in nearly every other mode the projector has (film/cinema/natural, etc.).
I don’t plan to have my Sony OLED calibrated though. The color rendering just seems so right and supposedly this is one advantage of Sony A95L. Less need for calibration, relatively.
Yes, the out-of-the-box color accuracy of the A95L is pretty amazing. The main things to check are probably shadow detail and red levels (which tend to be elevated with QLED, I hear).
TV calibration is such an odd industry. Do the professionals use measuring equipment? Or is it all just eyeballing test patterns?
They use dedicated sensors on the screen along with software to measure contrast and light levels and to balance color. They do it per input.
It can take a few hours. Lots of retailers offer the basic service, but purists tend to use established calibrators with good reputations.
Differences can be dramatic, though initially some feel the light levels are too low. Most love the PQ once they get used to it. If you have a ~$2K (or more) display, plan to keep it for a few years, and consider yourself a videophile, I would recommend it highly.
You can buy the basic tools and do it yourself, but it will likely cost more than a good calibrator. Plus there is the learning curve. So if you don’t plan to do multiple displays, it is probably not worth it.
Example… on the cheap end of things…
When I google TV calibration the first thing that pops up is Best Buy. Yeah right!
I found this list though. Maybe I’ll see how much they want.
take a look here:
https://www.avsforum.com/threads/isf-calibrators-where-are-you-located-please-post-here.586330/
Lots of pages, but some good info if you have time to look through it.
I would look into motion interpolation settings and things like judder, black frame insertion, etc.
I don’t know enough to dive deep into these things, but IMHO a lot of folks who feel that 4K is “too much” don’t realize the TV is doing a lot of background processing. I think motion interpolation is the worst of it. I like high-frame-rate content, and I don’t mind the picture of actual soap operas, but I CANNOT STAND motion interpolation; it makes me feel sick.
Point being, especially with 240hz screens, it may not be resolution that creates that feeling.
They use very expensive equipment and sensors. There’s no Light Master here. They talk about gear: such-and-such OLED must have such-and-such wavelength, or whatever technical babble that I could never understand.
You know when you’re at a certain forum full of nutty OCD people and a couple of professional posters are treated like they’re God and talk like they are God? They discuss technicals openly and no one dares to question them? Those two guys, AFAIK and IMHO, are DNice and ClassyTech on AVS Forum. They possibly tour around the country to calibrate, but you’d have to check.
These people are the real thing and calibrate professionally. I personally would use them were I to do it. My projector was calibrated by someone similar to these guys. It took him a few hours and the cost was reasonable, around $400-500 IIRC.
Well, $400-500 is nothing to sneeze at, but I’ll grant you it’s more reasonable in the context of a $3000 TV.
Given that different sources may look different (different Blu-ray players for example), wouldn’t the calibration of the TV be potentially more or less accurate depending on what you’re watching? I also noticed that Rtings recommended a different temperature preset depending on SDR and HDR. Do professional calibrators make this distinction or do they focus entirely on HDR?
They will calibrate for both HDR and SDR, as well as for each source you have available (at least by TV input selection). It is a comprehensive process, but once it is calibrated it will be accurate for whatever you watch, regardless of whether it is BD, streaming, OTA TV, or whatever. The calibrator will give you any info you need about whether to change settings for different modes.
Re. cost, agreed. That’s partially why I’m not doing it. Especially for this latest generation of LG and Sony, the “need” for calibration is, arguably, less.
Just to add to Mandrake’s response on how it’s done: on my JVC projector, for example, in addition to the 3 original settings (Natural, Cine, Film), new settings now appear (SDR, SDR Bright, HDR, etc.), and all the new ones render skin tone correctly. The original 3 settings are “exciting” but overly blue, and crush white by a significant margin (contrast is cranked to make things look good). It’s a VERY involved process using professional equipment. They go into the factory menu to change settings.
Among pro reviewers (not the calibrators) and truly dedicated/crazy hobbyists, you won’t find much argument against pro calibration, even for our Sony A95L, which is considered among the very best out of the box. The only question is whether the improvement is worth it. For me, and for now, it’s not, because the quality is already so spectacular.
Well… I self-calibrated my LG OLED, which at the time was considered quite good out of the box. After I got done I couldn’t imagine it getting better. Then I had a pro do it, and I was amazed at the difference. But it is all about what is important to one, I guess. Though at this point, with the changes in my disposable income, I would think long and hard about it today.
This depends on angular resolution, which in turn depends on viewing distance and screen size. There is a good article on the topic here. Because so much in the field of audio-visual gear marketing is based on pseudoscience, I have fact-checked some of the foundational details of this article against academic research on human vision, the same type of research that Apple relied on when they first introduced their “Retina” displays.
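As a rough sketch of the angular-resolution point (assuming the commonly cited ~1 arcminute of visual acuity, i.e. about 60 pixels per degree for 20/20 vision, which is the figure the “Retina” marketing leaned on):

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in, aspect=16/9):
    """Approximate pixels per degree of visual angle at the screen center."""
    # Screen width from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch = width_in / horizontal_px  # inches per pixel
    # Visual angle subtended by one pixel, converted to degrees.
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch / (2 * distance_in)))
    return 1 / deg_per_pixel

# 65" screen viewed from 8 feet (96"); ~60 px/deg is the acuity limit.
for horizontal_px, label in [(1920, "1080p"), (3840, "4K"), (7680, "8K")]:
    ppd = pixels_per_degree(65, horizontal_px, distance_in=96)
    print(f"{label}: {ppd:.0f} px/deg")
```

At that distance 1080p already sits near the acuity limit and 4K is well past it, which lines up with the point that the extra resolution only pays off when the screen fills more of your field of view.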
It was eye-opening for me back when I first bought an HDTV, and the Blu-ray of Jurassic Park to go with it. To see how much improvement there was, I swapped back and forth a few times between the Blu-ray and the DVD and… well… there was a difference, but it was a bit underwhelming. That started me learning more about the topic, and doing things like simulating resolution changes by downscaling and upscaling images.
I want to add that there are other factors that affect the sense of detail in an image besides the actual angular resolution. Increasing edge acuity via sharpening techniques, or even just increasing contrast, can give an impression that resolution has increased even though there might not actually be more detail. So to sell 4K TVs, those settings often get cranked up to create this impression. The result can be an image that seems “hyper-real,” for lack of a better term, which is eye-catching but can also seem ultimately artificial or even distracting.
In fact, these sorts of techniques were at work, to a lesser degree, even in my 480-vs-1080 testing of Jurassic Park. At a minimum, upscaling (for which the industry has very good interpolation techniques) was preventing a sense of pixelation, and there may have been some sharpening going on as well.
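The kind of experiment described above can be mimicked in a few lines (a toy sketch on a 1-D row of pixels, assuming a simple block-average downscale and nearest-neighbour upscale; real video pipelines use much better filters):

```python
def downscale(row, factor):
    """Block-average a 1-D row of pixel values by an integer factor."""
    return [sum(row[i:i + factor]) // factor
            for i in range(0, len(row), factor)]

def upscale_nearest(row, factor):
    """Nearest-neighbour upscale: repeat each pixel `factor` times."""
    return [v for v in row for _ in range(factor)]

# A broad, sharp edge (dark fabric next to a highlight) survives a
# round trip through half resolution:
edge = [10, 10, 10, 10, 200, 200, 200, 200]
print(upscale_nearest(downscale(edge, 2), 2))
# → [10, 10, 10, 10, 200, 200, 200, 200]

# Fine alternating texture, one pixel wide, is destroyed entirely:
texture = [10, 200, 10, 200, 10, 200, 10, 200]
print(downscale(texture, 2))  # → [105, 105, 105, 105]
```

This is why the visible payoff of extra resolution depends so heavily on how much single-pixel-scale detail the source actually contains.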
All things considered, 4K can add some benefit with displays positioned to fill a large portion of your field of view, but it’s not a major difference unless you’re sitting unusually close.
8K goes beyond the point at which there is a benefit on screens you can perceive whole without turning your head. My understanding is that the primary interest in 8K is not displays but cameras: it enables content creators to shoot a scene wider than they otherwise might and select the final framing in the edit, which gives them more leeway in the shooting and editing steps.
I think you should often be able to notice an increase in detail at 4K unless you are sitting unreasonably far away or are using a smaller screen (sub 55"). However, the difference in detail may be less pronounced depending on the films in question.
Many early Blu-rays come from subpar transfers and/or have comparatively low bitrates. On the other hand, some films will not look much better in 4K even with the best possible scan and bitrate, due to being shot on 16mm rather than 35mm film, or being finished at 2K despite some elements being of higher resolution.
The bottom line is that the question “Is 4K worth it?” needs to be answered individually for each movie. Some 4K discs are actually worse than the preceding Blu-ray (Pirates of the Caribbean). Other times, the 4K is the first high-quality transfer the film has ever had.
And of course, all of this ignores the questions of wide color gamut and high dynamic range; I’m just focusing on the possibility of increased detail.