That depends on what you mean. If your question is whether you can plug in a 4K monitor, set the resolution to 4K, and have it work, as opposed to the monitor displaying a distorted image, displaying no image at all, or blue screening, then yes, your hardware almost certainly does support 4K.
If the question is whether you can get most games to run well at 4K if you're willing to turn down whatever other settings you need to, then again, the answer is yes. Bad UI scaling will be a bigger impediment to a good 4K experience than insufficiently powerful hardware.
If the question is whether you can run all games at 4K resolution and max all other settings, then no. Your hardware isn't powerful enough for that. And neither is anyone else's, though a GeForce RTX 2080 Ti would get you much closer to it than the hardware you have.
I have a 980 and I run games in 4K. I do have to turn AA and other details down to Medium or High instead of Ultra. I'm also OK with a game dipping under 60 FPS occasionally (but not under 30).
As quiz says: performance isn’t the bad part about 4K, scaling is. And your card is faster than mine by a good bit.
In my experience, you can play most games from before 2015 maxed out at 4K with that setup. You can also do so with a lot of games from after 2016 that aren't resource-heavy. For more modern titles you will have to reduce settings or the resolution. Most MMOs will be playable maxed out at 4K. I do recommend turning off AA entirely with a 4K monitor: at that pixel density it's largely pointless and causes more issues than it fixes. You will also have UI scaling issues in games that don't use a vector-based UI.
You are talking about roughly 8 million pixels per frame versus 2 million pixels per frame, with each pixel needing about 20-60 calculations. Considering the card can do about 110,000,000,000 float32 calculations per frame, it's all dependent on how good the game programmer is.
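To make the pixel counts concrete, here is a quick sketch (the throughput figure itself is disputed in the replies below):

```python
# Pixel counts per frame at the two resolutions being compared.
pixels_4k = 3840 * 2160      # 8,294,400 -- roughly 8.3 million
pixels_1080p = 1920 * 1080   # 2,073,600 -- roughly 2.1 million

print(pixels_4k / pixels_1080p)  # 4.0 -- exactly four times the pixels
```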
First off, you dropped two zeroes. Second, even then, the number is wrong, as the card is rated at about 8 TFLOPS, not 11. A 1080 Ti is closer to 11. And even the TFLOPS number is kind of dubious, as it assumes all FMA, which counts as two operations for whatever reason. I think it's more illuminating to cut the number in half to get how many instructions it can do per second, since plenty of other instructions besides FMA get used.
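For what it's worth, the headline rating falls out of shader count times clock speed times two. A quick sketch using the GTX 1070 Ti's published specs:

```python
# Deriving the headline TFLOPS figure for a GTX 1070 Ti.
# FMA (fused multiply-add) counts as two operations, hence the factor of 2.
cuda_cores = 2432          # published shader count
boost_clock_hz = 1683e6    # published boost clock, 1683 MHz

tflops = cuda_cores * boost_clock_hz * 2 / 1e12
print(f"~{tflops:.1f} TFLOPS rated")                          # ~8.2
print(f"~{tflops / 2:.1f} trillion instructions per second")  # ~4.1, halved
```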
Performance is as much about how much work the programmer is trying to do as how good he is at optimizing it. And you can turn down the amount of work that the game is trying to do by turning down some settings.
Computations per pixel is kind of dubious for a whole lot of reasons. For starters, a game is likely to have a lot of different pixel/fragment shaders, some of which do a lot more work than others. Only one of the five programmable pipeline stages operates on pixels directly, though it does tend to be by far the most computationally heavy one. And there are plenty of reasons why the number of pixel/fragment shader invocations per pixel per frame often won't be exactly one, from transparency to post-processing effects to one object being in front of another.
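As a toy illustration of that last point (every factor below is invented; real numbers vary wildly by game and scene):

```python
# Hypothetical per-pixel invocation count. All factors here are made up
# purely to illustrate why the count is rarely exactly 1.
overdraw = 1.8            # opaque surfaces shaded, then covered by others
transparency = 0.5        # extra blended fragments per pixel, on average
post_processing = 3.0     # full-screen passes such as bloom or tone mapping

invocations_per_pixel = overdraw + transparency + post_processing
print(invocations_per_pixel)  # 5.3, not 1.0
```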
No, mine was right. I divided by 60 since it's per frame. At 4K resolution, the pixel shader step will be where the biggest hit is. I was also extrapolating to the worst-case scenario and making a general statement that a lot depends on the programmer of these shaders.
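Dividing the rated throughput by the frame rate does land near the figure in question (taking the ~8 TFLOPS rating from the reply above):

```python
# Per-frame budget at 60 FPS, using the ~8 TFLOPS rating cited above.
rated_flops = 8e12   # float32 ops per second, with FMA counted as 2
fps = 60

per_frame = rated_flops / fps
print(f"~{per_frame:.3g} ops per frame")  # ~1.33e11, near the 110e9 claimed
```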
Comments
I'd imagine the 1070 Ti could get around 30 FPS in some cases.
As someone who prefers playing at a consistent 60 FPS or higher, I don't consider my 1080 Ti to be enough.
I'm not a pro on hardware, so take this as just my two cents.
4K for text and web browsing is very, very nice. But your hardware isn't fast enough that you should buy a new 4K monitor to replace a 1080p monitor unless the new monitor is also otherwise better.