Honestly, this is probably the worst time to be investing in 4K. Only the most top-of-the-line processors and graphics cards can scratch the surface of what 4K has to offer, and most of them still fail to deliver a full 60fps experience at native 4K.
- Gaming is about to go through several quality-of-life changes
- TV technology is not up to spec
- General viewing practices don't take advantage of it yet
Rasterization is out, ray tracing/path tracing is coming in
Rasterization is the process of taking a 3D scene and breaking it down into how it would look on the monitor. This process is old. How old? It dates back to at least the 1980s.
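To make that concrete, here is a bare-bones sketch of the heart of rasterization, boiled down to its simplest step: projecting a 3D point onto a 2D screen. The function name and numbers are mine, just for illustration, not from any particular engine.

```python
# Minimal sketch of the core of rasterization: take a point in 3D space
# and figure out which pixel it lands on. Real GPUs do this for millions
# of triangle vertices per frame, then fill in the pixels between them.

def project_to_screen(x, y, z, screen_w=1920, screen_h=1080, focal=1.0):
    """Perspective-project a 3D point (camera space) to pixel coordinates."""
    # Perspective divide: things farther away (bigger z) get squeezed
    # toward the center of the screen.
    ndc_x = (focal * x) / z
    ndc_y = (focal * y) / z
    # Map from normalized device coordinates (-1..1) to pixel coordinates.
    px = int((ndc_x + 1.0) * 0.5 * screen_w)
    py = int((1.0 - (ndc_y + 1.0) * 0.5) * screen_h)
    return px, py

# A point 1 unit right, 0.5 up, 4 units in front of the camera:
print(project_to_screen(1.0, 0.5, 4.0))   # (1200, 472)
```

Everything a traditional game renders, from shadows to reflections, is built on top of tricks layered over this one projection step.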
This video is super informative and paints a very good picture of where video game rendering tech is going. It answers a lot of questions about why this generation of games has not looked 200-500% better than the last, and what to expect from consoles in 2020 (because that is the year the PS5 will launch).
In the tech world, I do not have to tell you that 40+ years as the standard is beyond ancient. The only reason "underpowered" consoles like the Switch are able to run games at 720p/30fps is by stripping out some of the post-process effects and just letting the game render as-is, without the extras layered on top of that initial pass.
To some people that would not be a huge deal ("just buy a better X, Y, and Z"), but it is quickly becoming clear that the way games are rendered is actually hurting overall production and not allowing the medium to move forward.
The issue is that graphics cards are supposed to accelerate 3D imaging and the rasterization process, but even the most high-end cards are running into diminishing returns.
Graphics card manufacturers are having trouble getting their tech down to the 7nm process. It will probably get done, but even AMD, which has the technology to do it, is not releasing a card that is dramatically better than the market leader (even though they could). Part of the reason is that rasterization requires ever more memory for all the shadow maps, bump maps, occlusion maps, and shading maps.
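To put rough numbers on that memory pressure (my own back-of-the-envelope figures, not taken from any real game): a single uncompressed 4096x4096 map at 4 bytes per texel already eats 64MB, and a modern scene carries a lot of these.

```python
# Rough memory cost of the baked maps rasterization leans on.
# Sizes and map choices are illustrative, not from any specific title.

def map_size_mb(width, height, bytes_per_texel=4):
    return width * height * bytes_per_texel / (1024 ** 2)

maps = {
    "shadow map (4096x4096)": map_size_mb(4096, 4096),
    "bump/normal map (2048x2048)": map_size_mb(2048, 2048),
    "occlusion map (2048x2048)": map_size_mb(2048, 2048),
    "shading/albedo map (2048x2048)": map_size_mb(2048, 2048),
}

for name, mb in maps.items():
    print(f"{name}: {mb:.0f} MB")

print(f"total for this small handful of maps: {sum(maps.values()):.0f} MB")
```

That is over 100MB for just a handful of maps, before compression, and a full scene multiplies that across every light and material.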
With RAM prices spiking and no cheap physical media that can handle the capacity games now need to ship on, the industry is being forced to think about alternatives. Game developers want a new storage medium because Blu-ray is not cutting it.
Ray tracing/path tracing is going to take over, and games that used to be 500GB-scale will take considerably less space, because the way they are rendered will require fewer baked maps and other supplemental files to run.
As the video above shows, the 2070 is performing about as well as a two-year-old mobile phone processor that cost $150 (probably worth $25-50 now). That actually has some pretty positive implications, because it shows how half-heartedly the first RTX cards were launched into the market.
It is bad that NVIDIA did not invest more into what was supposed to be the big selling point of the card, but it also means that with more money and a proper GPU at $400-$800 (1.6-4.3x better), we could see a massive leap in ability if a single company were willing to take the jump (20+ gigarays per second, which is almost at the roughly 50 path-tracing samples the video above says are needed to make a true jump).
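As a rough sanity check on those numbers (my own back-of-the-envelope math, counting one ray per sample and ignoring bounces): 20 gigarays per second spread across a 4K image at 60fps works out to about 40 samples per pixel, which is indeed in the neighborhood of that ~50-sample target.

```python
# Back-of-the-envelope: how many path-tracing samples per pixel does a
# "20 gigarays/second" GPU buy you at 4K 60fps? (One ray per sample,
# bounces ignored, so treat this as an optimistic upper bound.)

gigarays_per_second = 20e9
width, height, fps = 3840, 2160, 60

rays_per_frame = gigarays_per_second / fps            # ~333 million rays/frame
samples_per_pixel = rays_per_frame / (width * height)

print(f"{rays_per_frame:,.0f} rays per frame")
print(f"~{samples_per_pixel:.0f} samples per pixel")  # ~40 spp
```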
Considering the technology is not the same as it was in 2016, and we have another year of progress before 2020, it is conceivable that we will have cards that hit the numbers needed for path tracing to take over the market.
Both GPU manufacturers have seen this coming, and with a new console generation looming... it is the perfect moment to impose a new rendering standard, one that will look great but probably will not always hit 4K 60fps.
TV manufacturers are terribly cheap
So unless you have the money for a Titan V or a 2080 Ti, you are wasting your money on a format that will not be the standard of the near future.
Why can I say that without a doubt?
Because TV manufacturers are cheap and will cut anything that adds overhead for them. 60Hz has been the standard for a very long time, and it will probably be a very long time before it changes, mainly because console gaming has set that standard and generally doesn't budge from it: capping the refresh at 60fps is a great middle ground between performance and overall looks.
TV manufacturers want to make you buy a TV with 120Hz-240Hz
120Hz does not help with anything that is rendered below 120Hz. Yes, frame interpolation can "upgrade" the picture using algorithms and other tricks, but it generally just adds frame lag and artifacts and does not necessarily make the game better. On the PC side, actually rendering at 120fps can provide a small upgrade.
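The lag part is baked into how interpolation works: to invent a frame between frame N and frame N+1, the TV has to hold onto N until N+1 has arrived, so at minimum you are watching everything one real frame late. A toy sketch of the idea (a naive blend, nothing like the motion-vector algorithms real TVs use):

```python
# Naive frame interpolation: to show a "tween" between frame N and N+1,
# the display must already have frame N+1 in hand. That buffering is why
# interpolation always adds at least one frame of input lag.

def interpolate(frame_a, frame_b, t=0.5):
    """Blend two frames (toy lists of pixel values) at position t between them."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

frame_n      = [0, 10, 20]     # three-pixel "frames" for illustration
frame_n_plus = [10, 20, 40]

tween = interpolate(frame_n, frame_n_plus)
print(tween)  # [5.0, 15.0, 30.0] -- can only be shown after frame N+1 arrives
```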
But then again, we come back to how even the best graphics cards are having trouble holding 60fps... which means 120fps is out of the question at this point without sacrificing resolution. Which again calls into question the need for a screen that targets 4K pixel density.
"120 motion rate" is a failure in itself. The screen has to update at double the rate of the signal just to provide correct frame pacing at 60Hz. On a technical level they have cheaped out on how the panel draws the image, so they have to double the refresh rate to hit the timing accurately...
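A quick cadence check makes the frame-pacing point clearer (again, my own arithmetic, not from any spec sheet): a 120Hz panel can show every 24, 30, or 60fps source frame an even number of times, while a plain 60Hz panel has to juggle 24fps content with an uneven 3:2 repeat pattern.

```python
# Frame pacing: how many panel refreshes does each source frame get?
# Even division = smooth cadence; uneven division = judder.

for panel_hz in (60, 120):
    for content_fps in (24, 30, 60):
        repeats = panel_hz / content_fps
        even = panel_hz % content_fps == 0
        print(f"{content_fps}fps on a {panel_hz}Hz panel: "
              f"{repeats:.2f} refreshes per frame "
              f"({'even' if even else 'uneven 3:2-style cadence'})")
```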
Ultra HD Blu-ray only supports 60Hz as a maximum. So why waste your money?
$300-500 for a 4K TV might be tempting
But the devil is in the details... which most manufacturers won't give you.
It is hilarious how terrible the spec information on consumer TVs is. They took a section that is supposed to be a tell-all, it's-here-or-it-isn't list and managed to make it so vague that you might end up with an 8-bit panel that is not even HDR10 compatible. The only upside would be the resolution, and even that could be a substandard panel variant that stretches the pixels just a little so the manufacturer can save on cost.
Granted, it is 10-bit and supports the home HDR format HDR10, but beyond that there are no 100% definitive specs on what this TV actually performs at. No listing for nits, no actual number for colors. It claims to be "4K Cinema HDR," but the DCI standard is 12-bit... through SDI, not... It claims to support Dolby Vision, but I have yet to be shown any specs on their own website proving it would pass as a 100% certified device... rather, it simply accepts the wider 12-bit gamut and tries to match it on a 10-bit display.
And that is what you get for $800.
I don't know how retailers can get away with such blatant mismatching of actual standards with rubbish terminology that says almost nothing about the quality of the product they are selling.
Most of the formats are irrelevant to what you want to do.
Gaming as we know it has been locked between 30fps and 60fps, generally at 720p last generation and 900-1080p this generation. Even most games on the PS4 Pro are upscaling in one form or another from around 1080p to 4K, and not necessarily always nailing it. Even the Xbox One X needs dynamic scaling from lesser resolutions to keep 60fps.
It is very similar to how 1080i was back in the day: half the data rate, but it allowed 60Hz at 1080 lines, which did provide slightly better sharpness at the cost of overall quality.
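For what it's worth, the dynamic scaling mentioned above is a simple idea at heart: if the last frame blew its time budget, render the next one smaller and upscale it. A toy version of the logic (the thresholds, step sizes, and names here are invented for illustration, not how any particular engine does it):

```python
# Toy dynamic resolution scaler: if the last frame took too long, render
# the next one at a lower internal resolution and upscale to 4K output.

TARGET_MS = 1000 / 60          # 16.7 ms frame budget for 60fps
render_scale = 1.0             # 1.0 = native 3840x2160

def adjust_scale(last_frame_ms, scale, step=0.05, lo=0.6, hi=1.0):
    if last_frame_ms > TARGET_MS:        # over budget -> drop resolution
        return max(lo, scale - step)
    return min(hi, scale + step)         # under budget -> creep back up

for frame_ms in (15.0, 18.2, 19.0, 16.0, 14.5):
    render_scale = adjust_scale(frame_ms, render_scale)
    w, h = int(3840 * render_scale), int(2160 * render_scale)
    print(f"frame took {frame_ms:.1f} ms -> next frame renders at {w}x{h}")
```

The output still gets stretched to a 4K signal either way, which is why "4K" on the box rarely means every frame was drawn at 3840x2160.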
Most games have been designed from the ground up for 8-bit color, and moving to 10-bit without any updates just makes them look flat. Developers have to save space somewhere, and if colors were always going to be sampled at 8-bit, there was no need to store anything beyond that until now. Which means game sizes are going to balloon again...
There is more HDR support this generation, but again, it probably won't be until 2020 that it becomes a standard, when the new video game consoles push back against the "if it ain't broke" mentality.
Generally speaking, if your TV supports "millions of colors," it is an 8-bit display. Most of the cheaper 4K TVs are just 8-bit and will be outdated as the whole industry slowly moves forward. Ultra HD Blu-ray, for how backward it seems, supports 10-bit media, and it will probably be the final nail in the coffin of 8-bit color sampling.
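The "millions vs. billions" language maps directly onto bit depth, and the math is simple enough to check yourself:

```python
# Color counts by bit depth: each pixel has 3 channels (R, G, B), so the
# total palette is 2^(bits per channel * 3).

for bits in (8, 10, 12):
    colors = 2 ** (bits * 3)
    print(f"{bits}-bit per channel: {colors:,} colors")

# 8-bit per channel : 16,777,216 colors        ("millions")
# 10-bit per channel: 1,073,741,824 colors     ("billions")
# 12-bit per channel: 68,719,476,736 colors    (~68.7 billion)
```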
So you could buy a 4K TV now, but you would probably have to upgrade it in a year or two just to see the difference, as the next consoles will support a larger gamut and offer a much more immersive experience that is just not possible in 8-bit.
As a video editor, I can tell you the difference is night and day in how many subtle adjustments can be made.
I hope you don't think I am hating on 4K
It is not that 4K is a bad resolution; if you have the money and the proper hardware, you are certainly getting a wonderfully sharper picture.
But right now the technology to support TVs like this is... subpar. Still, there are some good uses for 4K right now.
1) Huge screen.
If you need to put a game on a screen larger than 25" that you will not be sitting closer than 10' from, you will probably get a nice boost in quality without giving up too much. And if you are going for a theater setup, I would say go 4K or nothing, as any screen larger than 60" will look terrible with 1080p stretched across it unless you are sitting at the back of the room (see the rough pixels-per-degree calculation after this list).
I totally understand the whole 30 vs. 60fps argument for some games... but when the screen is the size of a small ranch house... framerate kind of takes a back seat to resolution.
2) Farsighted.
If you are naturally farsighted and cannot stand sitting right next to the screen with your glasses on, and would rather relax and just enjoy things as you are, that would also be a good reason to get a 4K TV right now.
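On the screen-size point above, a rough angular-resolution calculation shows why distance matters so much. The ~60 pixels-per-degree figure I use as the limit of 20/20 vision is a common rule of thumb, and the screen sizes and distances are just example numbers of my own:

```python
# Rough check of when 4K detail is actually visible: about 60 pixels per
# degree is a rule-of-thumb limit for 20/20 vision. If a screen already
# delivers 60+ px/deg at 1080p from where you sit, the jump to 4K is
# very hard to see.

import math

def pixels_per_degree(horizontal_pixels, screen_diagonal_in, distance_ft):
    width_in = screen_diagonal_in * 16 / math.hypot(16, 9)   # 16:9 panel width
    distance_in = distance_ft * 12
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_pixels / fov_deg

for diag, dist in ((55, 10), (75, 10), (120, 12)):
    ppd_1080 = pixels_per_degree(1920, diag, dist)
    ppd_4k = pixels_per_degree(3840, diag, dist)
    print(f'{diag}" at {dist} ft: 1080p = {ppd_1080:.0f} px/deg, '
          f'4K = {ppd_4k:.0f} px/deg')
```

On a 55" set at 10 feet, 1080p is already past that threshold; it is only on very large screens (or very short distances) that 1080p falls below it and 4K starts to pay off.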
Here is a quick guide for specs if you are looking for a 4K TV right now:
Does it have billions of colors? (aka is it 10-bit?) *Millions of colors means 8-bit.* **~68 billion means 12-bit.**
Does it have at least 1000 nits of brightness? (That is the brightness needed for real HDR10 performance... TVs can accept an HDR10 signal without hitting those numbers, so go by that minimum to make sure you get 100% true performance.)