Okay, an odd one here, but still, it's hardware related
I'm thinking of buying a 4K monitor for my PS4 Pro, and there's quite a large number to choose from, with most of them starting at 27" or 28". The problem is that I don't have that much space on my PC desk, and I do want to enjoy the 4K resolutions certain games (Horizon Zero Dawn, to name one) offer. I've searched the net a lot and found a couple of nice-looking 24" 4K monitors (ASUS has one with really good specs) and am looking for input on it.
The main question is whether a 24" 4K monitor would be any good for a PS4. I realize the screen is pretty small and native 4K games would not fully show its beauty. But with a lot of games being upscaled from 3200x1800 to 4K, I wonder how much those games would 'suffer' on the smaller 24" screen.
I also realize that my PC is in no condition to support 4K at the moment. That 1090T is way too old to push the graphics up to 4K, and I also wonder if the GTX 970 (factory OC) is up to the task. I'm wondering if upgrading the CPU to an FX-8350 would solve the 'problem' of 4K on my PC. I'm not yet willing to build a whole new PC, because I'm waiting to see what AMD Zen is going to do in the coming years, as well as Intel's response to the Zen series...
Any input is appreciated!
Comments
Since you have limited space, can't you mount the screen on a wall instead of having it on your desk?
The GTX 970 can run 4K fine, but all games require more resources at 4K, so certain high-end games might have issues or will at least give you a lower FPS.
I run 2K at 27" myself at the moment. It's a huge improvement over 1080p, but going 4K is not worth it for me; I don't see much difference.
One thing you missed in my question, though, is the PS4 Pro part. With most games upscaling from ~3K to 4K (and putting the extra Pro power elsewhere), I wonder how much difference that would make.
Good to know, though, that a GTX 970 can run 4K fine. Too bad my CPU won't be able to handle it.
There are no 4K HDR monitors YET. ASUS has announced them for later this year, though they will be quite expensive (starting at €900, compared to half that for a normal ASUS 4K monitor). Also worth noting that SONY will bring HDR to their 1080p Bravia TVs later this year as well. And from an article I've read, broadcasters say that people will benefit more from 1080p HDR than from 4K (I read both after posting my question here). This might even make me want to hold off on buying a 4K TV/monitor at all, wait for 1080p HDR, and benefit from all the extras the Pro will bring at 1080p60 compared to 4K30, with games like Skyrim and The Last Guardian suffering big-time in performance...
As for what can run 4K, most graphics cards can, but running at 4K will of course give you a lower FPS than 1080p. I don't have an exact percentage, and it probably differs from game to game, but if you assume something like a 25% drop (which is on the high side) and can live with that decrease in FPS in a specific game, then your card can run it at 4K.
For instance, I get a decrease of about 5% when I run GW2 at 2K compared to 1080p on my old GTX 780 with graphics maxed out. I'm not sure how much difference 4K would make for the same game, since my screen maxes out at 2K.
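To put rough numbers on why 4K is so much heavier than 1080p or 2K: GPU cost tends to scale with the number of pixels rendered, at least when the game is GPU-bound rather than CPU-bound (that scaling is an assumption; real games vary). A quick sketch of the ratios for the resolutions mentioned in this thread:

```python
# Pixel counts for the resolutions discussed in this thread.
# Assumption: GPU cost scales roughly linearly with pixel count,
# which only holds when the game is GPU-bound, not CPU-bound.
resolutions = {
    "1080p": (1920, 1080),
    "2K (1440p)": (2560, 1440),
    "1800p (a common PS4 Pro render target)": (3200, 1800),
    "4K": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```

4K works out to exactly four times the pixels of 1080p, which helps explain why the FPS drop in GPU-bound games tends to be much larger than a 25% guess.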
Personally, I think that if you want to increase your resolution and have a screen smaller than 30", you should consider 2K as well. It is usually way cheaper, still an upgrade, and I have my doubts that you will see the difference between it and 4K.
I myself have a four-year-old Dell U2713 27". The reason for picking it was that it was rather cheap (at the time) but has amazing colors, partially due to it having better backlighting than most other screens. I work somewhat with pictures, but having good colors helps in games as well.
In some games the game is limited by the CPU, and you might not lose any FPS, because the CPU doesn't really have to work any harder even when you increase the resolution. But in most games the FPS loss when increasing resolution is huge.
Here's a nice chart with FPS using different resolutions:
http://www.babeltechreviews.com/rx-480-vs-290x-vs-gtx-1060-amd-neglected-hawaii-35-games-benchmarked/3/
More important than screen size is how large a part of your field of view is taken up by the monitor: if you're sitting really close to your monitor, then even a 24-inch monitor will benefit from 4K, whereas if you're far from your monitor, you won't be able to notice any difference.
But with a console controller, one normally doesn't want to press their nose against the screen. Usually people lean back and relax. That would put you far enough from the monitor that a 4K console game on a 24-inch monitor doesn't sound like a very good idea.
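The field-of-view point can be made concrete with a little trigonometry. A common rule of thumb is that 20/20 vision resolves about one arcminute, i.e. roughly 60 pixels per degree; once a screen exceeds that, extra resolution is hard to see. A minimal sketch (the rule of thumb and the distances are illustrative assumptions, not settled fact):

```python
import math

def pixels_per_degree(diagonal_in, res_w, distance_in):
    """Approximate horizontal angular pixel density for a 16:9 panel
    viewed head-on. Rule of thumb: ~60 px/degree saturates 20/20 vision."""
    # Physical width of a 16:9 screen, derived from its diagonal
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    # Horizontal field of view the screen occupies, in degrees
    fov_deg = 2 * math.degrees(math.atan2(width_in / 2, distance_in))
    return res_w / fov_deg

# A 24" 4K monitor at desk distance (~2 ft) vs leaned-back distance (~4 ft)
for distance in (24, 48):
    ppd = pixels_per_degree(24, 3840, distance)
    print(f'24" 4K at {distance}": ~{ppd:.0f} px/degree')
```

Even at 2 feet, a 24" 4K panel already sits well past the ~60 px/degree mark, and leaning back with a controller only pushes it further, which supports the point that a couch-distance 24" 4K setup buys you little.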
As for the PS4, when I had a PS4 Pro I noticed no difference in quality going from a Samsung 8500 55-inch curved TV to a 28-inch AOC 4K monitor. You can still tell it's 4K.
Let me put it this way: the only way you're going to tell it's 4K is if you sit within a foot of the screen. To tell it's 4K at something like 15 feet, you would need around a 120-inch TV. So if you're sitting a foot or two away from a 28-inch monitor, you will be able to tell. If you sit 2 or 3 feet away, there is zero reason to buy a 4K 24-inch monitor; you won't be able to tell. In fact, it may look worse, because when the PS4 Pro goes to 4K, it turns off a lot of shadows and other effects to keep it playable at 4K.
Also, don't let anyone convince you that you need a $1,700 Samsung 8500 TV, because I've had a 6000, 6500, 7000, 7500, and 8500, and they are all garbage. They all have the same bad backlight bleed. The 43-inch Sharp I have now has the same picture quality, and better backlighting, than the $1,700 Samsungs. The only time it's worth going expensive is for an OLED. Also, HDR is 100 percent a gimmick. I tested HDR on the PS4 Pro, and it made the picture look way worse, especially in ESO, and I tried all those TVs, plus an LG 4K HDR, a Philips, and a Toshiba. It makes it look like someone turned the color all the way up and blurred the screen; it really looks terrible.
Also, @Loke666, I realize that 4K monitors are quite expensive, but the ASUS one I saw isn't overly so. It's around €400, which I don't think is much for an ASUS monitor at all. The same model in 28" comes in at roughly €550...
Good luck in your search. It took me many 100-mile trips to find the TV I wanted, lol, but I am very picky about backlight bleed, and I can't afford a $2,700 OLED. I would recommend this one:
https://www.walmart.com/ip/Sharp-LC-43N6100U-43-4K-Ultra-HD-2160p-60Hz-LED-Smart-HDTV-4K-x-2K/50855541
It has just as good a picture as this one: https://www.amazon.com/Samsung-UN65KS8000-65-Inch-Ultra-Smart/dp/B01C5TFLSE?th=1
But it costs almost $1,000 less. Mind you, the Samsungs have better smart features, but if you have a PC and a PS4, there really is zero point to them.
I recommend you buy an HDR TV from Walmart, Amazon, or Best Buy, so you can make sure you like HDR and return it if you don't. I personally thought it was garbage, and it wasn't just me. I sent pictures to my friend and had my daughter and my girlfriend look at it. Everyone said it looked worse. Maybe that's just ESO and Deus Ex, I dunno.
Ahh, I just saw that you sit 4 feet away. I am blind, so I have to sit like 6 inches from my TVs, lol; too many 15-inch CRT monitors, I suppose, haha. Anyway, according to this, you would need a 65-inch TV to tell the difference between 4K and 1080p: https://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/
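The chart's numbers can be reproduced with the same one-arcminute rule of thumb: find the distance at which a 1080p screen of a given size already delivers about 60 pixels per degree; past that distance, 4K adds nothing the eye can resolve. A rough sketch (the rule of thumb itself is an assumption, and individual eyesight varies):

```python
import math

def max_useful_1080p_distance(diagonal_in):
    """Distance (inches) beyond which a 16:9 1080p screen of this size
    already saturates ~20/20 acuity (about 60 pixels per degree),
    so 4K would be imperceptible. A rough rule of thumb only."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    fov_deg = 1920 / 60  # field of view at which 1080p reaches 60 px/degree
    return width_in / (2 * math.tan(math.radians(fov_deg / 2)))

for size in (24, 28, 65, 120):
    feet = max_useful_1080p_distance(size) / 12
    print(f'{size}": 4K stops mattering beyond ~{feet:.1f} ft')
```

This lands on roughly 8 feet for a 65-inch TV and about 15 feet for a 120-inch one, consistent with both the linked chart and the "120 inch at 15 feet" estimate earlier in the thread.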
This chart is the complete opposite of everything said about 4K resolutions, both in this thread and elsewhere. And looking at my own desk/PC room, a 24" might not be too bad after all, though a 50" would be a better fit (according to the chart). That really makes me wonder if 4K is viable at all, and whether, in my case, that 24" (or a 27") monitor would be a good option for my PS4 anyway.
Also, thanks for the heads-up that each game uses 4K differently. Perhaps it's something I still need to think over a bit more, and perhaps (as mentioned before) I should wait and see how those Bravias look when they come out...
I went with a Samsung (http://www.samsung.com/uk/monitors/uhd-ue590/LU24E590DSEN/), due to my previous poor experience with ASUS. That said, it's likely on a case by case basis.
I have the monitor on my desk, so I'm watching it up close. The difference between 1080p and 4K is very clearly noticeable. It's most noticeable when reading text. To my surprise, I am enjoying the benefits of 4K on text the most - definitely the most underrated reason for switching to 4K. Just browsing the web in general feels amazing (and using 1080p when visiting other computers is a clear downgrade).
The benefits are also clearly visible in games. There is a lot more detail overall. That said, I needed to upgrade from a GTX 970 to a GTX 1080, both for 4K and for VR. The majority of MMOs and 3+ year old games work fine at 4K on a GTX 970, but more recent games like The Witcher 3 struggled even at 2K on the 970.
The one area where I don't personally see a big difference is movies. I am guessing it's due to the rendering - a game in 4K will always be crisper than a movie in 4K. A movie is recorded live, resulting in all sorts of artifacts in the final render.
The biggest nuisance of 4K is user interface compatibility. This is a non-factor with the PS4, but a fairly big factor when using a PC. Many older games simply don't support UI upscaling. More interestingly, a lot of production software (Adobe Suite, 3ds Max) also struggles with UI upscaling; apparently there are bottlenecks in the way the software is designed, making the jump to a 4K UI not straightforward. If you predominantly use production software on the 4K display, I'd advise checking compatibility first.
It's basically like sitting in the front row of a movie theater: the closer you are, the harder it is to take in the detail of the whole picture, and you therefore lose out on one of the main benefits of 4K. Hence viewing closer than 6' is not optimal.
The one thing to check on your 4K set, if you will be playing games, is input delay: basically the time from when you press a button to when you see something happen on the screen. The OLED TVs had this problem until a firmware update fixed it.
Just for the record, just because a PS4 Pro can output at 4K doesn't mean it's a 4K gaming system. The GTX 970 you have will run CIRCLES around the GPU in the PS4 Pro. You're much better off sticking to 1080p on it and enjoying the better framerates over the non-Pro PS4.
As for your PC, a 970 can run 4K decently well in all but the most demanding games. But if you want to play The Witcher 3 at 4K with maxed settings, for example, you're looking at something more like a 1080 Ti (and even then just barely).
The 4K meme has been sold to a lot of people, and it frankly boggles my mind that so many buy into it. You can make a much stronger argument for 4K at PC sitting distances (typically 2-3'), but at TV and normal couch distances 4K is utterly imperceptible to 99.95% of people.
I spend a huge amount of time in the home theater world, and we have this discussion all the time. About 10 years ago the US Air Force did visual acuity tests on some of their top fighter pilots (guys with 20/6.5, 20/8, etc. vision), and even THEY weren't able to resolve the pixel density of modern phones.
My personal recommendation would be to pick up one of the 144Hz IPS G-Sync/FreeSync gaming panels and call it a day. 1440p is much less taxing on your video card and looks amazing at normal computer seating distances. Is 4K at those distances objectively better? Yes. But we're talking about SERIOUS diminishing returns here. It's not like back in the day when you went from 1024x768 to 1920x1080 or something like that, where it's this gigantic, obvious difference. To tell the difference you would literally have to have them side by side on still images and examine them closely.
Anyways, my 2 cents.
"The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."
- Friedrich Nietzsche
That is actually far more than I thought, but I only use 2K myself, so I am hardly an expert, and most games I play are rather CPU-heavy (my CPU is the most high-end part in my computer). Heck, I mostly play MMOs and 4X games. There is a huge difference between 2K and 4K in how many resources they use.
Still, a GTX 970 can run most games even if it loses 60% of the FPS; the problem comes when you get into high-end FPS games, as I said above.