Anybody seen any 4k G-Sync monitors on sale? Looking to upgrade from my BenQ XL2420T I purchased quite some time ago. Size doesn't really matter.. 24-28 or so.
Or if you can convince me otherwise, any 4k monitor lol.
Comments
no.
¯\_(ツ)_/¯
A turtle doesn't move when it sticks its neck out.
As someone who doesn't do editing, I'll take a 120-144Hz 2K monitor over a 60-100Hz 4K any day of the week.
It's also weird when folks talk about 4K displays for gaming when developers are barely optimizing games to run smoothly at ultra settings at 1080p.
Folks have to start understanding that 2K/4K/8K resolutions only became a thing because of the growing size of viewing spaces and content creators wanting crisp visuals. It all comes down to the size of your display relative to where your eyes are, and whether you can see the sharpness of lines and the edges of details. That means if your gaming display is 27-32 inches and you're sitting directly in front of it, 2K is the sweet spot, unless you're a creator who wants more visible workspace when dealing with larger content.
Outside of gaming, the irony is that 4K content can actually look better on a 2K screen because of pixel density and downscaling. Cinematographers typically shoot at double the resolution of their expected output and shrink it down for this reason. If you want to see the reverse in motion, watch your local news in 4K and compare the studio shots and on-air graphics against some of the footage they pull from other sources.
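That "shoot high, scale down" trick is basically supersampling. Here's a minimal sketch of the idea, a 2x box-filter downscale (assuming numpy; real pipelines use fancier filters than this):

```python
import numpy as np

def box_downscale_2x(img):
    """Average each 2x2 block into one output pixel (a simple box filter)."""
    h, w = img.shape[:2]
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A hard edge at full resolution becomes a softened, anti-aliased edge
# at half resolution -- which is part of why downscaled footage looks so clean.
hi_res = np.zeros((8, 8))
hi_res[:, 3:] = 1.0              # sharp vertical edge
print(box_downscale_2x(hi_res))  # each row comes out as [0. 0.5 1. 1.]
```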
Movies are filmed at 24fps, your local news at 60fps, and sporting events use cameras running at 120fps and up, largely for slow motion. Higher frame rates reduce motion blur, which gives the content a more realistic, looking-through-a-window feel, and slow motion looks much clearer.
With all that in mind, for PC gaming I'd assume the look we'd all want graphically is that 120fps, Monday Night Football look. The problem is that 120fps isn't some baseline that technology has made mainstream and moved on from; even 60fps is still a struggle in the consumer market. Multiplying the resolution while display sizes have a practical cap only makes managing frame rates harder.
This is why I chuckle when I see people going on about ray tracing on the RTX 2080, because they're creating a new consumer-market issue while the outstanding ones are still there.
TL;DR
Bigger numbers do not mean a better experience.
Pardon the rant. I'm display shopping too
¯\_(ツ)_/¯
As for 4K 144Hz monitors on sale ... I haven't run across any yet. It would surprise me to see these on any significant sale, as they still tend to sit at the very high end of the market.
First, let's talk about pixels and inches. Right now, you have a 24", 1920x1080 monitor. That's about 92 pixels per inch. If you get a 4K monitor that is also only 24", that will be about 184 pixels per inch. Something that is 100 pixels across is more than an inch now, but will be barely half an inch on the new monitor.
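For what it's worth, here's the arithmetic behind those numbers; just a back-of-the-envelope check using the sizes and resolutions above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal in pixels divided by diagonal in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

old = ppi(1920, 1080, 24)   # ~92 PPI
new = ppi(3840, 2160, 24)   # ~184 PPI
print(f'24" 1080p: {old:.0f} PPI, 24" 4K: {new:.0f} PPI')
print(f'A 100-pixel element: {100 / old:.2f} in vs {100 / new:.2f} in')
```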
Some programs scale well to whatever monitor resolution you're using, but some don't. If you get too many pixels and not enough inches, the interface in some programs--including both games and non-gaming software--will be so tiny as to be a pain to use. I'd be cautious before going that route.
Next, let's talk about pixels and refresh rates. The more pixels you have to draw, the more work it is to draw each frame, and that tends to lead to lower frame rates. But you probably already knew that. And up to a point, you can buy higher frame rates by throwing a heftier video card at it.
But that's not the only trade-off. You can only push pixels through a monitor cable so fast, and that can mean lower refresh rates than you might like at higher resolutions. A given monitor might be able to do 4K at 30 Hz, 2560x1440 at 60 Hz, or 1920x1080 at 120 Hz. A newer monitor than that might be able to do 4K at 60 Hz or 2560x1440 at 144 Hz.
There are apparently monitors that can do 4K at 144 Hz, but they cost a fortune. I haven't looked into it, so I'm not sure if it's just the latest and greatest version of DisplayPort or if there's more special sauce involved like needing two monitor cables that get tied together into a single image.
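Some rough numbers on why the cable is the bottleneck. This ignores blanking intervals and protocol overhead, so real requirements are a bit higher, and the link-rate figures in the comment are just the commonly quoted effective data rates:

```python
def raw_gbps(width, height, hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gbit/s (ignores blanking and overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

for w, h, hz in [(1920, 1080, 120), (2560, 1440, 144), (3840, 2160, 60), (3840, 2160, 144)]:
    print(f"{w}x{h} @ {hz} Hz: {raw_gbps(w, h, hz):.1f} Gbit/s")

# For comparison, HDMI 2.0 carries roughly 14 Gbit/s of video data and
# DisplayPort 1.4 (HBR3) roughly 26 Gbit/s -- which is why 4K @ 144 Hz needs
# compression (DSC), chroma subsampling, or more than one cable.
```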
And just because a monitor can do something doesn't necessarily mean that your video card can. I'd advise reading the documentation very carefully and making sure that the monitor says that it can do the refresh rate you want at the resolution you want using a particular version of a particular protocol. And then make sure that your video card supports that same version of the same protocol. For example, the 144 Hz monitors I have support both DisplayPort and HDMI, but can only do 60 Hz if using HDMI. If you get a 4K monitor that can only do 4K at 30 Hz, you're probably not going to be happy with it. For that matter, if you're used to 144 Hz, you might not be happy with 4K at 60 Hz, either.
There's also the matter of image quality. A lot of gaming monitors that go for high resolutions and high frame rates use a TN panel with image quality that is pretty bad, but does allow a quicker response time to get a new image on the screen a few milliseconds faster. That's what your current monitor did, so you might be used to it. But some do use an IPS panel, which tends to give much better image quality. Maybe you care about that and maybe you don't, but personally, I want an IPS panel.
And then there is the issue of refresh rates and adaptive sync. The higher a monitor's refresh rate, the less adaptive sync matters. If something just misses 144 Hz and so it drops to 72 Hz, oh well. If it just misses 60 Hz and so drops to 30 Hz, that's bad. For adaptive sync to raise the latter to 50 Hz is a lot more valuable than raising the former to 100 Hz.
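To put numbers on that, here's a simplified model of plain double-buffered v-sync (no adaptive sync), where the output snaps down to the refresh rate divided by a whole number whenever you miss a refresh:

```python
def vsync_fps(refresh_hz, render_fps):
    """Effective frame rate with plain v-sync: each frame waits for the next
    refresh, so output locks to refresh_hz / n for the smallest n that fits."""
    n = 1
    while n / refresh_hz < 1 / render_fps:
        n += 1
    return refresh_hz / n

print(vsync_fps(144, 140))  # 72.0 -- adaptive sync would display ~140 Hz
print(vsync_fps(60, 55))    # 30.0 -- adaptive sync would display ~55 Hz
```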
Supporting adaptive sync adds basically nothing to the cost of a good quality monitor. FreeSync is AMD's implementation of adaptive sync, plus a little extra sauce, and the same goes for it: for a company building a good quality monitor, supporting FreeSync costs basically nothing extra. Intel doesn't support adaptive sync yet, but says that it will. Because it's an industry standard, anyone who wants to can support it.
G-sync is not Nvidia's implementation of adaptive sync; it's a proprietary way to do about the same thing for the sake of breaking compatibility with the industry standard. In order to support G-sync, a monitor vendor has to buy a special module from Nvidia for about $100. By the time various parties take their markup, that adds about $150 to the retail cost of a monitor. That's why on average, if a monitor vendor makes two identical monitors, one of which supports FreeSync and the other G-sync, the latter typically costs about $150 more for the same thing. That's not monitor vendors trying to gouge you; they're just passing along the cost that Nvidia charges them.
Right now, Nvidia owns the high end of video cards, so they can get away with it to some extent. What happens when they don't? AMD basically starved their GPU division of resources for a few years to focus on CPUs and stave off bankruptcy. But Ryzen and EPYC have saved the company, and they're no longer starving their GPU division. What happens when AMD GPUs are competitive with Nvidia at the high end again? Think Nvidia will want to put themselves at a $150 price disadvantage by refusing to support adaptive sync? Or do you think they'll decide to support adaptive sync--and ultimately drop support for G-sync?
Speculation is speculative, but I think that for them to never support adaptive sync will eventually be suicidal. At that point, Nvidia can fix the problem by deciding to support adaptive sync. If that happens, adaptive sync monitors will be strongly preferable to G-sync monitors, as they'll be supported on everything, while G-sync will only be supported on Nvidia GPUs, and only until Nvidia decides to pull the plug on support for long discontinued products. And that's in addition to the adaptive sync monitor having cost $150 less up front.
Or what happens if AMD has a clearly superior product at some point? This could mean AMD dominating the high end, or merely offering the same performance as Nvidia for much less money. These things go back and forth, and AMD has been ahead in enough ways at enough times in the past that I'd bet that they will be again at some point in the future. Do you still buy Nvidia at that point, or would you instead wish that you hadn't gone with a G-sync monitor? If you're a hard-core Nvidia fanboy, then maybe you buy Nvidia anyway (or just skip that generation and wait for Nvidia to be ahead again before buying your next GPU), but that's not the ideal situation for most people.
It's a fair comparison when you're looking for a deal.
I'm all for higher refresh rates for gaming, but manufacturers have resolution and refresh rate married to each other, and one is the antithesis of the other due to current PC hardware limitations.
Because of consumer ignorance/confusion and the marketing that plays to it, manufacturers are discarding 1080p and damn near skipping 2K in favor of 4K, without letting folks optimize their viewing space at their leisure. If a consumer is running 27-32 inches at 16:9 for entertainment purposes only, 4K spits in the face of getting the best visual/atmospheric experience... for what?
¯\_(ツ)_/¯
So.. it seems like the majority of people are for 2k, not 4k. And probably adaptive sync instead of g-sync. Interesting.
What video card are you using?
Ultrawide is the most immersive addition I've made when it comes to gaming. Especially if you're an MMO player, that extra real estate on screen is something to see in games like WoW, FFARR, GW2, etc. It's awesome. Don't go 28 inch, go 30+; ultrawide seems smaller for some reason.
You can get a decent 2k Ultra Wide for under a grand. Sounds like budget isn't a concern for you though. The suggestion of the Predator with Gsync would be my choice as well.
Aloha Mr Hand !
In most cases, "ultrawide" is better described as short rather than wide, though at 34", you can argue that it isn't actually that short. The problem is that for nearly everything you do on a computer monitor, even at 16:9, you run out of vertical space before horizontal. That includes web browsing like you're doing right now; practically every site there is uses the sides just for ads because there's no other real use for them, and wider only means more ads or more blank space. It also includes e-mail, word processing, or any other sort of reading or writing text, as reading very long lines is awkward, but a taller screen lets you fit more lines.
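To put numbers on the short-versus-wide point, here's a quick geometry check comparing a 34" 21:9 ultrawide against a 27" 16:9 (just the standard aspect-ratio math, nothing monitor-specific):

```python
import math

def width_height(diagonal_in, aspect_w, aspect_h):
    """Physical width and height in inches from diagonal size and aspect ratio."""
    unit = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

for name, d, aw, ah in [('27" 16:9', 27, 16, 9), ('34" 21:9', 34, 21, 9)]:
    w, h = width_height(d, aw, ah)
    print(f"{name}: {w:.1f} x {h:.1f} in")
# 27" 16:9: 23.5 x 13.2 in; 34" 21:9: 31.3 x 13.4 in -- barely any taller, just wider.
```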
And running out of vertical space first also includes most games. I once went back and looked through all the games I had to see if I could find any candidates that would have been better at 21:9 than 16:9 if the game had supported it. The only ones I could come up with were Tecmo Super Bowl and Uniracers. Old console games like that don't support adjustable resolutions, anyway.
I have been told that the extra width is nice for first person shooters, which are a genre that I ignore. I could believe that, as it gets you a little more peripheral vision, though still not very much. I could also believe that the extra width is nice if you zoom way in so that even if you're not quite in first person perspective, you're awfully close to it. But if the problem is that you can't see very well because you've zoomed in too far, the easier solution is to zoom out.
That said, what matters is what works for the way you use the computer, not the way someone else does. If playing first person shooters is 90% of what you do on a computer, then an ultra wide monitor makes a lot more sense for you than it does for me.
Personally, I have three of these:
https://www.newegg.com/Product/Product.aspx?Item=N82E16824236466
They're in portrait mode, for a combined resolution of 4320x2560. The combined diagonal measurement is 47", and the 27:16 aspect ratio is a little closer to square than 16:9. The bezels don't bother me, though I could understand why some people wouldn't like them. And the $1600 total price tag also rules out the configuration for most people. But it was a way to get 144 Hz at higher than 4K more than three years ago.
You can get plenty high frame rates at high resolutions if you're willing to turn down settings. Your video card can handle just about any game at 4K and 60 frames per second or higher if you're willing to turn down settings. If you insist on max settings for everything, some games are going to choke at 4K.
What makes sense for you depends tremendously on your preferences. I'm not saying that the monitor you linked is bad. I'm saying that it makes sense for some people and not others. Some people would see 60 Hz or TN as a deal-breaker, while others would see them as not a big deal--and an acceptable price to get a 4K, 28" monitor without breaking the bank.
I'd personally lean toward 2560x1440 because that makes it much easier to get plenty of inches so as not to make everything tiny, and 144 Hz if you want it.
I have 27" IPS 4K monitors right now (Dell P2715Q). I love the extra PPI for browsing the web, for editing stuff, and just general use. But as far as gaming goes - 4K isn't a large benefit over, say, 1080 with nice anti-aliasing. I picked them up because one of my 24' 1920x1200 monitors was starting to dim/yellow - it was 12 years old, so it had a good life. I needed something, nothing was available that ticks all my boxes, so I picked these up for pretty cheap to use until that very special monitor comes along one day....
What I would really like to see is a consistent HDR implementation on the PC. On the PS4, HDR can be a dramatic improvement (it's a per-developer thing, some are better at it than others). I've never been terribly sensitive to FPS dips, but VRR (Freesync/Gsync) would go a long way to help with tearing.
My personal priorities are:
- Physical size: It has to fit on my desk. I like twin monitors side by side; my workflow and gaming habits have more or less grown around that. That's a physical limitation I'm not willing to work around. I have 2x 27" right now, and I couldn't go much larger. I might be able to support a pair of 32", or do something strange like a 32" in landscape with a 24" next to it in portrait. I don't think I would care for a single 46", three monitors in portrait mode, or an ultrawide form factor. This is a personal preference thing, though.
- Sharp, bright colors: OLED is ~the best~, IPS right now is a distant second. I would consider a good VA, but TN isn't an option for me
- Crisp Image: This is why I chose 4k over high refresh rate, for the extra PPI. I run at about 150% scaling. Not everything in Windows scales well, but the stuff that I use most does.
- VRR: I currently have an nVidia card, I am not willing to pay extra for Gsync though, especially when I see the same monitor with Freesync often going for $200-$300 less
- HDR: This actually is a higher priority for me on screens in general, but since Windows is absolutely abhorrent in HDR support, it's lower on the list for a PC monitor.
- >60Hz Refresh
I would gladly fork over $2000 for 27" 4K GSync 120Hz HDR monitors, ~if~ they were OLED. Monitors are an item that I don't upgrade often, and I have got a lot of years out of all my previous monitors, so I'm ok with paying for something that should last me for a few generations of hardware. I'd feel a whole lot better about it if we could get to one standard implementation of VRR, and I think the industry will move that way with HDMI 2.1, it just isn't there yet.
That being said, I also don't game on my PC anywhere near as often as I used to. As a result, I find myself more forgiving of my current hardware, and much more selective when I do decide to finally upgrade something.