4k G-Sync monitors on sale?

OhhPaigeyOhhPaigey Member RarePosts: 1,517
edited November 2018 in Hardware
Anybody seen any 4k G-Sync monitors on sale? Looking to upgrade from my BenQ XL2420T I purchased quite some time ago. Size doesn't really matter.. 24-28 or so.

Or if you can convince me otherwise, any 4k monitor lol.
When all is said and done, more is always said than done.

Comments

  • RidelynnRidelynn Member EpicPosts: 7,383
    Hahahahaha.

    no.
    Grunty
  • FlyByKnightFlyByKnight Member EpicPosts: 3,967
    Cyber Monday is your day. Before then, B&H is having a Black Friday sale, and I saw a couple of G-Syncs on sale there.
    "As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*" 

    ¯\_(ツ)_/¯
  • FlyByKnightFlyByKnight Member EpicPosts: 3,967
    DMKano said:
    Unless you are ready to drop $1500+, good luck finding something cheaper for a good 4K 144Hz G-Sync.


    What's a good G-sync and why does it HAVE to be 144hz?
    "As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*" 

    ¯\_(ツ)_/¯
  • The user and all related content has been deleted.

    A turtle doesn't move when it sticks its neck out.
  • EponyxDamorEponyxDamor Member RarePosts: 749
    edited November 2018
    Personally I have no interest in 4K other than my home TV.
    I game in 2K (1440p) and love it.

    Personally, I find refresh rate to make more of a difference than resolution in gaming. However, for some forms of editing a 4K monitor provides better results. So, mileage may vary from person to person, depending on what they do on their desktop.

    As someone who doesn't do editing, I'll take a 120-144Hz 2K monitor over a 60-100Hz 4K any day of the week.
    laseritOzmodan
  • FlyByKnightFlyByKnight Member EpicPosts: 3,967
    It's weird that in a time when the PC master race is all about "ultra settings or bust," people are simultaneously going so hard for high refresh rates. What card setups are folks running to get 144 fps and better, with no dips, on the majority of games?

    It's also weird when folks are talking about 4K displays for gaming when developers are barely optimizing games to run smoothly on ultra at 1080p.

    Folks have to start understanding that 2K/4K/8K resolutions only became a thing because of the growing size of viewing spaces and content creators wanting crisp visuals. All of this comes down to the size of your display relative to where your eyes are, and being able to see the sharpness of lines and the edges of details. This means if your gaming space is 27-32 inches and you're sitting directly in front of the screen, 2K is the sweet spot. That is, unless you're a creator who wants more visible workspace when dealing with larger-sized content.
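    To put rough numbers on that (a back-of-the-envelope sketch in Python; the ~28" viewing distance and the ~60 pixels-per-degree acuity figure are assumptions, not measurements):

        import math

        def pixels_per_degree(ppi, viewing_distance_in):
            # One degree of visual angle spans 2*d*tan(0.5 deg) inches at distance d.
            return ppi * 2 * viewing_distance_in * math.tan(math.radians(0.5))

        # 27" 1440p (~109 PPI) vs 27" 4K (~163 PPI) at a typical ~28" desk distance
        print(pixels_per_degree(109, 28))  # ~53 px/degree
        print(pixels_per_degree(163, 28))  # ~80 px/degree, already past the ~60
                                           # px/degree that 20/20 vision resolves

    By that rough math, 4K at desk distance is already finer than most eyes can pick apart, which is the sweet-spot argument in numbers.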

    Outside of gaming, the irony is that 4K content looks better on a 2K screen because of pixel density. Cinematographers typically shoot at double the resolution of their expected output and shrink it down for this reason. If you want to see the reverse in motion, watch your local news on a 4K set and compare the studio shots and on-air graphics against some of the footage they pull from other sources.

    Movies are filmed at 24fps, your local news at 60fps, and sporting events at 120fps+. The reason for the higher frame rates is to reduce motion blur. This gives the content a more realistic, looking-through-a-window feel. Slow motion looks much clearer.

    With all that in mind, for PC gaming I'd assume the look we'd all want graphically is the 120fps, Monday Night Football look. The problem is that 120fps isn't some nominal benchmark that technology has made mainstream and moved on from; 60fps is still an issue on the consumer market. Exponentially growing the viewing resolution while display sizes have a hard cap only makes managing frame rates harder.

    This is why I chuckle when I see people going on about ray tracing on the RTX 2080: they're creating a new consumer market issue while the outstanding ones are still there.

    TL;DR
    Bigger numbers don't mean a better experience.

    Pardon the rant. I'm display shopping too.
    EponyxDamorAsm0deusOzmodanultimateduck
    "As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*" 

    ¯\_(ツ)_/¯
  • EponyxDamorEponyxDamor Member RarePosts: 749
    edited November 2018
    Thought I'd share ... Newegg's eBay shop has the AW3418DW 34" 120Hz G-Sync 3440x1440 (2K ultrawide) IPS monitor on sale for $720 (originally $1,500, 'cause Alienware). Compared to monitors with similar specs, such as the Acer X34P ($850) with an identical panel, it's not a bad price. Hoping for better deals on Monday, but if I don't find one, I'm happy with this one.

    As for 4K 144Hz monitors on sale ... I haven't run across any yet. It would surprise me to see these go on any significant sale, as they still sit at the very high end of the market.
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    FlyByKnight said:
    It's weird that in a time when the PC master race is all about "ultra settings or bust," people are simultaneously going so hard for high refresh rates. What card setups are folks running to get 144 fps and better, with no dips, on the majority of games?

    It's also weird when folks are talking about 4K displays for gaming when developers are barely optimizing games to run smoothly on ultra at 1080p.

    Folks have to start understanding that 2K/4K/8K resolutions only became a thing because of the growing size of viewing spaces and content creators wanting crisp visuals. All of this comes down to the size of your display relative to where your eyes are, and being able to see the sharpness of lines and the edges of details. This means if your gaming space is 27-32 inches and you're sitting directly in front of the screen, 2K is the sweet spot. That is, unless you're a creator who wants more visible workspace when dealing with larger-sized content.

    Outside of gaming, the irony is that 4K content looks better on a 2K screen because of pixel density. Cinematographers typically shoot at double the resolution of their expected output and shrink it down for this reason. If you want to see the reverse in motion, watch your local news on a 4K set and compare the studio shots and on-air graphics against some of the footage they pull from other sources.

    Movies are filmed at 24fps, your local news at 60fps, and sporting events at 120fps+. The reason for the higher frame rates is to reduce motion blur. This gives the content a more realistic, looking-through-a-window feel. Slow motion looks much clearer.

    With all that in mind, for PC gaming I'd assume the look we'd all want graphically is the 120fps, Monday Night Football look. The problem is that 120fps isn't some nominal benchmark that technology has made mainstream and moved on from; 60fps is still an issue on the consumer market. Exponentially growing the viewing resolution while display sizes have a hard cap only makes managing frame rates harder.

    This is why I chuckle when I see people going on about ray tracing on the RTX 2080: they're creating a new consumer market issue while the outstanding ones are still there.

    TL;DR
    Bigger numbers don't mean a better experience.

    Pardon the rant. I'm display shopping too.
    What makes you think that the people pushing ultra settings or bust are the same people as the ones pushing 144 Hz?  I turn a lot of settings down or off, but I like my 144 Hz at 4320x2560 resolution.  Even if I "only" get 100 frames per second in a game, that's going to display better on a 144 Hz monitor than it will on a 60 Hz monitor.
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    OhhPaigey said:
    Anybody seen any 4k G-Sync monitors on sale? Looking to upgrade from my BenQ XL2420T I purchased quite some time ago. Size doesn't really matter.. 24-28 or so.

    Or if you can convince me otherwise, any 4k monitor lol.
    Let's back up a bit.  There are trade-offs that you may or may not realize, and it's not clear what your preferences are.

    First, let's talk about pixels and inches.  Right now, you have a 24", 1920x1080 monitor.  That's about 92 pixels per inch.  If you get a 4K monitor that is also only 24", that will be about 184 pixels per inch.  Something that is 100 pixels across is more than an inch now, but will be barely half an inch on the new monitor.
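    If you want to check that math, it's the standard diagonal formula; a quick sketch in Python:

        import math

        def ppi(width_px, height_px, diagonal_in):
            # Pixel density = pixels along the diagonal / diagonal length in inches.
            return math.hypot(width_px, height_px) / diagonal_in

        print(ppi(1920, 1080, 24))  # ~92 PPI (a 24" 1080p monitor)
        print(ppi(3840, 2160, 24))  # ~184 PPI (a 24" 4K panel)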

    Some programs scale well to whatever monitor resolution you're using, but some don't.  If you get too many pixels and not enough inches, the interface in some programs--including both games and non-gaming software--will be so tiny as to be a pain to use.  I'd be cautious before going that route.

    Next, let's talk about pixels and refresh rates.  The more pixels you have to draw, the more work it is to draw each frame, and that tends to lead to lower frame rates.  But you probably already knew that.  And up to a point, you can buy higher frame rates by throwing a heftier video card at it.

    But that's not the only trade-off.  You can only push pixels through a monitor cable so fast, and that can mean lower refresh rates than you might like at higher resolutions.  A given monitor might be able to do 4K at 30 Hz, 2560x1440 at 60 Hz, or 1920x1080 at 120 Hz.  A newer monitor than that might be able to do 4K at 60 Hz or 2560x1440 at 144 Hz.
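    The cable limits are easy to sanity-check with raw pixel-rate arithmetic (a rough sketch assuming 24 bits per pixel; it ignores blanking intervals and encoding overhead, so real requirements run somewhat higher):

        def raw_gbps(width, height, hz, bits_per_pixel=24):
            # Uncompressed pixel data rate; blanking and encoding overhead excluded.
            return width * height * hz * bits_per_pixel / 1e9

        print(raw_gbps(3840, 2160, 60))   # ~11.9 Gbps: fits DisplayPort 1.2 (~17.3 Gbps usable)
        print(raw_gbps(2560, 1440, 144))  # ~12.7 Gbps: also fits
        print(raw_gbps(3840, 2160, 144))  # ~28.7 Gbps: more than DP 1.4 (~25.9 Gbps usable)

    That last line is why 4K at 144 Hz needs something extra, as the next paragraph gets into.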

    There are apparently monitors that can do 4K at 144 Hz, but they cost a fortune.  I haven't looked into it, so I'm not sure if it's just the latest and greatest version of DisplayPort or if there's more special sauce involved like needing two monitor cables that get tied together into a single image.

    And just because a monitor can do something doesn't necessarily mean that your video card can.  I'd advise reading the documentation very carefully and making sure that the monitor says that it can do the refresh rate you want at the resolution you want using a particular version of a particular protocol.  And then make sure that your video card supports that same version of the same protocol.  For example, the 144 Hz monitors I have support both DisplayPort and HDMI, but can only do 60 Hz if using HDMI.  If you get a 4K monitor that can only do 4K at 30 Hz, you're probably not going to be happy with it.  For that matter, if you're used to 144 Hz, you might not be happy with 4K at 60 Hz, either.

    There's also the matter of image quality.  A lot of gaming monitors that go for high resolutions and high frame rates use a TN panel with image quality that is pretty bad, but does allow a quicker response time to get a new image on the screen a few milliseconds faster.  That's what your current monitor did, so you might be used to it.  But some do use an IPS panel, which tends to give much better image quality.  Maybe you care about that and maybe you don't, but personally, I want an IPS panel.

    And then there is the issue of refresh rates and adaptive sync.  The higher a monitor's refresh rate, the less adaptive sync matters.  If something just misses 144 Hz and so it drops to 72 Hz, oh well.  If it just misses 60 Hz and so drops to 30 Hz, that's bad.  For adaptive sync to raise the latter to 50 Hz is a lot more valuable than raising the former to 100 Hz.
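    The halving in those examples is plain double-buffered vsync arithmetic: a frame that misses a refresh tick waits for the next one. A minimal sketch:

        import math

        def vsync_fps(refresh_hz, render_fps):
            # With vsync, each frame is held until a refresh tick, so the displayed
            # rate is the refresh rate divided by whole refresh periods per frame.
            return refresh_hz / math.ceil(refresh_hz / render_fps)

        print(vsync_fps(144, 140))  # 72.0: just missing 144 Hz costs you half
        print(vsync_fps(60, 55))    # 30.0: the same near-miss at 60 Hz hurts far more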

    Supporting adaptive sync adds basically nothing to the cost of a good quality monitor.  FreeSync is AMD's implementation of adaptive sync, plus a little extra sauce, but really, for a company building a good quality monitor to support FreeSync adds basically nothing to the cost.  Intel doesn't support adaptive sync yet, but says that they will.  Because it's an industry standard, anyone can support it who wants to.

    G-sync is not Nvidia's implementation of adaptive sync; it's a proprietary way to do about the same thing for the sake of breaking compatibility with the industry standard.  In order to support G-sync, a monitor vendor has to buy a special module from Nvidia for about $100.  By the time various parties take their markup, that adds about $150 to the retail cost of a monitor.  That's why on average, if a monitor vendor makes two identical monitors, one of which supports FreeSync and the other G-sync, the latter typically costs about $150 more for the same thing.  That's not monitor vendors trying to gouge you; they're just passing along the cost that Nvidia charges them.

    Right now, Nvidia owns the high end of video cards, so they can get away with it to some extent.  What happens when they don't?  AMD basically starved their GPU division of resources for a few years to focus on CPUs and stave off bankruptcy.  But Ryzen and EPYC have saved the company, and they're no longer starving their GPU division.  What happens when AMD GPUs are competitive with Nvidia at the high end again?  Think Nvidia will want to put themselves at a $150 price disadvantage by refusing to support adaptive sync?  Or do you think they'll decide to support adaptive sync--and ultimately drop support for G-sync?

    Speculation is speculative, but I think that for them to never support adaptive sync will eventually be suicidal.  At that point, Nvidia can fix the problem by deciding to support adaptive sync.  If that happens, adaptive sync monitors will be strongly preferable to G-sync monitors, as they'll be supported on everything, while G-sync will only be supported on Nvidia GPUs, and only until Nvidia decides to pull the plug on support for long discontinued products.  And that's in addition to the adaptive sync monitor having cost $150 less up front.

    Or what happens if AMD has a clearly superior product at some point?  This could mean AMD dominating the high end, or merely offering the same performance as Nvidia for much less money.  These things go back and forth, and AMD has been ahead in enough ways at enough times in the past that I'd bet that they will be again at some point in the future.  Do you still buy Nvidia at that point, or would you instead wish that you hadn't gone with a G-sync monitor?  If you're a hard-core Nvidia fanboy, then maybe you buy Nvidia anyway (or just skip that generation and wait for Nvidia to be ahead again before buying your next GPU), but that's not the ideal situation for most people.
    OhhPaigeymmolou
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    EponyxDamor said:
    Thought I'd share ... Newegg's eBay shop has the AW3418DW 34" 120Hz G-Sync 3440x1440 (2K ultrawide) IPS monitor on sale for $720 (originally $1,500, 'cause Alienware). Compared to monitors with similar specs, such as the Acer X34P ($850) with an identical panel, it's not a bad price. Hoping for better deals on Monday, but if I don't find one, I'm happy with this one.

    As for 4K 144Hz monitors on sale ... I haven't run across any yet. It would surprise me to see these go on any significant sale, as they still sit at the very high end of the market.
    That's not 4K.  And I don't particularly care about a price comparison to monitors with similar specs when the specs on it are stupid for a computer monitor.
    Ozmodan
  • EponyxDamorEponyxDamor Member RarePosts: 749
    Quizzical said:
    Thought I'd share ... Newegg's eBay shop has the AW3418DW 34" 120Hz G-Sync 3440x1440 (2K ultrawide) IPS monitor on sale for $720 (originally $1,500, 'cause Alienware). Compared to monitors with similar specs, such as the Acer X34P ($850) with an identical panel, it's not a bad price. Hoping for better deals on Monday, but if I don't find one, I'm happy with this one.

    As for 4K 144Hz monitors on sale ... I haven't run across any yet. It would surprise me to see these go on any significant sale, as they still sit at the very high end of the market.
    That's not 4K.  And I don't particularly care about a price comparison to monitors with similar specs when the specs on it are stupid for a computer monitor.
    I didn't say it was 4k.

    It's a fair comparison when you're looking for a deal.
    Ozmodan
  • MikehaMikeha Member EpicPosts: 9,196
    Let me know if you find a deal on a 4K FreeSync TV with HDR10 and Dolby Vision. ;)
    EponyxDamorRidelynn
  • FlyByKnightFlyByKnight Member EpicPosts: 3,967
    Quizzical said:
    It's weird that in a time when the PC master race is all about "ultra settings or bust," people are simultaneously going so hard for high refresh rates. What card setups are folks running to get 144 fps and better, with no dips, on the majority of games?

    It's also weird when folks are talking about 4K displays for gaming when developers are barely optimizing games to run smoothly on ultra at 1080p.

    Folks have to start understanding that 2K/4K/8K resolutions only became a thing because of the growing size of viewing spaces and content creators wanting crisp visuals. All of this comes down to the size of your display relative to where your eyes are, and being able to see the sharpness of lines and the edges of details. This means if your gaming space is 27-32 inches and you're sitting directly in front of the screen, 2K is the sweet spot. That is, unless you're a creator who wants more visible workspace when dealing with larger-sized content.

    Outside of gaming, the irony is that 4K content looks better on a 2K screen because of pixel density. Cinematographers typically shoot at double the resolution of their expected output and shrink it down for this reason. If you want to see the reverse in motion, watch your local news on a 4K set and compare the studio shots and on-air graphics against some of the footage they pull from other sources.

    Movies are filmed at 24fps, your local news at 60fps, and sporting events at 120fps+. The reason for the higher frame rates is to reduce motion blur. This gives the content a more realistic, looking-through-a-window feel. Slow motion looks much clearer.

    With all that in mind, for PC gaming I'd assume the look we'd all want graphically is the 120fps, Monday Night Football look. The problem is that 120fps isn't some nominal benchmark that technology has made mainstream and moved on from; 60fps is still an issue on the consumer market. Exponentially growing the viewing resolution while display sizes have a hard cap only makes managing frame rates harder.

    This is why I chuckle when I see people going on about ray tracing on the RTX 2080: they're creating a new consumer market issue while the outstanding ones are still there.

    TL;DR
    Bigger numbers don't mean a better experience.

    Pardon the rant. I'm display shopping too.
    What makes you think that the people pushing ultra settings or bust are the same people as the ones pushing 144 Hz?  I turn a lot of settings down or off, but I like my 144 Hz at 4320x2560 resolution.  Even if I "only" get 100 frames per second in a game, that's going to display better on a 144 Hz monitor than it will on a 60 Hz monitor.
    I didn't correlate the two; I said it's weird that it's a thing during the era of ultra or bust.

    I'm all for higher refresh rates for gaming, but manufacturers have resolution and refresh rates married to each other, and one is the antithesis of the other due to current PC hardware limitations.

    Because of consumer ignorance/confusion and the marketing that feeds it, manufacturers are discarding 1080p and damn near skipping 2K in favor of 4K, while not letting folks optimize their viewing space at their leisure. If a consumer is running 27-32 inches at 16:9 for entertainment purposes only, 4K spits in the face of getting the best visual/atmospheric experience... for what?
    "As far as the forum code of conduct, I would think it's a bit outdated and in need of a refre *CLOSED*" 

    ¯\_(ツ)_/¯
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    edited November 2018

    FlyByKnight said:
    I'm all for higher refresh rates for gaming, but manufacturers have resolution and refresh rates married to each other, and one is the antithesis of the other due to current PC hardware limitations.
    High resolution, high frame rate, ultra settings:  choose any two.  I don't need ultra settings.  Selectively turn down enough demanding settings and I can get my high frame rates and high resolutions simultaneously.  And because a lot of the most demanding things make a game look different but not really better, graphics don't necessarily have to suffer that much, unless you're into rasterized lighting effects.  Which I'm not.
  • RidelynnRidelynn Member EpicPosts: 7,383
    edited November 2018
    Quizzical said:

    I'm all for higher refresh rates for gaming, but manufacturers have resolution and refresh rates married to each other, and one is the antithesis of the other due to current PC hardware limitations.
    High resolution, high frame rate, ultra settings:  choose any two. 
    The problem is, no matter which two you pick, you'll have half a dozen deplorables who have to tell you how you're wrong. At least 2 of those 6 will claim they have all three as well. And by then, it will have devolved into yet another Green vs Red fanboy free-for-all.
    EponyxDamorOhhPaigeyQuizzicalOzmodanultimateduck
  • OhhPaigeyOhhPaigey Member RarePosts: 1,517
    edited November 2018
    Quizzical said:
    OhhPaigey said:
    Anybody seen any 4k G-Sync monitors on sale? Looking to upgrade from my BenQ XL2420T I purchased quite some time ago. Size doesn't really matter.. 24-28 or so.

    Or if you can convince me otherwise, any 4k monitor lol.
    Let's back up a bit.  There are trade-offs that you may or may not realize, and it's not clear what your preferences are.

    First, let's talk about pixels and inches.  Right now, you have a 24", 1920x1080 monitor.  That's about 92 pixels per inch.  If you get a 4K monitor that is also only 24", that will be about 184 pixels per inch.  Something that is 100 pixels across is more than an inch now, but will be barely half an inch on the new monitor.

    Some programs scale well to whatever monitor resolution you're using, but some don't.  If you get too many pixels and not enough inches, the interface in some programs--including both games and non-gaming software--will be so tiny as to be a pain to use.  I'd be cautious before going that route.

    Next, let's talk about pixels and refresh rates.  The more pixels you have to draw, the more work it is to draw each frame, and that tends to lead to lower frame rates.  But you probably already knew that.  And up to a point, you can buy higher frame rates by throwing a heftier video card at it.

    But that's not the only trade-off.  You can only push pixels through a monitor cable so fast, and that can mean lower refresh rates than you might like at higher resolutions.  A given monitor might be able to do 4K at 30 Hz, 2560x1440 at 60 Hz, or 1920x1080 at 120 Hz.  A newer monitor than that might be able to do 4K at 60 Hz or 2560x1440 at 144 Hz.

    There are apparently monitors that can do 4K at 144 Hz, but they cost a fortune.  I haven't looked into it, so I'm not sure if it's just the latest and greatest version of DisplayPort or if there's more special sauce involved like needing two monitor cables that get tied together into a single image.

    And just because a monitor can do something doesn't necessarily mean that your video card can.  I'd advise reading the documentation very carefully and making sure that the monitor says that it can do the refresh rate you want at the resolution you want using a particular version of a particular protocol.  And then make sure that your video card supports that same version of the same protocol.  For example, the 144 Hz monitors I have support both DisplayPort and HDMI, but can only do 60 Hz if using HDMI.  If you get a 4K monitor that can only do 4K at 30 Hz, you're probably not going to be happy with it.  For that matter, if you're used to 144 Hz, you might not be happy with 4K at 60 Hz, either.

    There's also the matter of image quality.  A lot of gaming monitors that go for high resolutions and high frame rates use a TN panel with image quality that is pretty bad, but does allow a quicker response time to get a new image on the screen a few milliseconds faster.  That's what your current monitor did, so you might be used to it.  But some do use an IPS panel, which tends to give much better image quality.  Maybe you care about that and maybe you don't, but personally, I want an IPS panel.

    And then there is the issue of refresh rates and adaptive sync.  The higher a monitor's refresh rate, the less adaptive sync matters.  If something just misses 144 Hz and so it drops to 72 Hz, oh well.  If it just misses 60 Hz and so drops to 30 Hz, that's bad.  For adaptive sync to raise the latter to 50 Hz is a lot more valuable than raising the former to 100 Hz.

    Supporting adaptive sync adds basically nothing to the cost of a good quality monitor.  FreeSync is AMD's implementation of adaptive sync, plus a little extra sauce, but really, for a company building a good quality monitor to support FreeSync adds basically nothing to the cost.  Intel doesn't support adaptive sync yet, but says that they will.  Because it's an industry standard, anyone can support it who wants to.

    G-sync is not Nvidia's implementation of adaptive sync; it's a proprietary way to do about the same thing for the sake of breaking compatibility with the industry standard.  In order to support G-sync, a monitor vendor has to buy a special module from Nvidia for about $100.  By the time various parties take their markup, that adds about $150 to the retail cost of a monitor.  That's why on average, if a monitor vendor makes two identical monitors, one of which supports FreeSync and the other G-sync, the latter typically costs about $150 more for the same thing.  That's not monitor vendors trying to gouge you; they're just passing along the cost that Nvidia charges them.

    Right now, Nvidia owns the high end of video cards, so they can get away with it to some extent.  What happens when they don't?  AMD basically starved their GPU division of resources for a few years to focus on CPUs and stave off bankruptcy.  But Ryzen and EPYC have saved the company, and they're no longer starving their GPU division.  What happens when AMD GPUs are competitive with Nvidia at the high end again?  Think Nvidia will want to put themselves at a $150 price disadvantage by refusing to support adaptive sync?  Or do you think they'll decide to support adaptive sync--and ultimately drop support for G-sync?

    Speculation is speculative, but I think that for them to never support adaptive sync will eventually be suicidal.  At that point, Nvidia can fix the problem by deciding to support adaptive sync.  If that happens, adaptive sync monitors will be strongly preferable to G-sync monitors, as they'll be supported on everything, while G-sync will only be supported on Nvidia GPUs, and only until Nvidia decides to pull the plug on support for long discontinued products.  And that's in addition to the adaptive sync monitor having cost $150 less up front.

    Or what happens if AMD has a clearly superior product at some point?  This could mean AMD dominating the high end, or merely offering the same performance as Nvidia for much less money.  These things go back and forth, and AMD has been ahead in enough ways at enough times in the past that I'd bet that they will be again at some point in the future.  Do you still buy Nvidia at that point, or would you instead wish that you hadn't gone with a G-sync monitor?  If you're a hard-core Nvidia fanboy, then maybe you buy Nvidia anyway (or just skip that generation and wait for Nvidia to be ahead again before buying your next GPU), but that's not the ideal situation for most people.
    I love your posts, thanks for the info. :)

    So.. it seems like the majority of people are for 2k, not 4k. And probably adaptive sync instead of g-sync. Interesting.
    Ozmodan
    When all is said and done, more is always said than done.
  • OhhPaigeyOhhPaigey Member RarePosts: 1,517
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    OhhPaigey said:
    Be warned that it's a TN monitor, and that its refresh rate is 60 Hz.  4K at 28" will also mean much smaller pixels than you're used to.  But getting rid of any of those drawbacks would require a much larger budget, so it's a question of priorities.

    What video card are you using?
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    Ridelynn said:
    Quizzical said:

    I'm all for higher refresh rates for gaming, but manufacturers have resolution and refresh rates married to each other, and one is the antithesis of the other due to current PC hardware limitations.
    High resolution, high frame rate, ultra settings:  choose any two. 
    The problem is, no matter which two you pick, you'll have half a dozen deplorables who have to tell you how you're wrong. At least 2 of those 6 will claim they have all three as well. And by then, it will have devolved into yet another Green vs Red fanboy free-for-all.
    While that is true, if you're the one using a monitor, what matters is how well you like it.  It doesn't matter how well random people on the Internet who don't have to use it think they would have liked it.
    Ridelynn
  • OhhPaigeyOhhPaigey Member RarePosts: 1,517
    Quizzical said:
    OhhPaigey said:
    Be warned that it's a TN monitor, and that its refresh rate is 60 Hz.  4K at 28" will also mean much smaller pixels than you're used to.  But getting rid of any of those drawbacks would require a much larger budget, so it's a question of priorities.

    What video card are you using?
    I have a 1080TI. How much would a good monitor cost? $1000+?
    When all is said and done, more is always said than done.
  • JeffSpicoliJeffSpicoli Member EpicPosts: 2,849
    OhhPaigey said:
    Quizzical said:
    OhhPaigey said:
    Be warned that it's a TN monitor, and that its refresh rate is 60 Hz.  4K at 28" will also mean much smaller pixels than you're used to.  But getting rid of any of those drawbacks would require a much larger budget, so it's a question of priorities.

    What video card are you using?
    I have a 1080TI. How much would a good monitor cost? $1000+?
    I have a 980 Ti paired with an LG ultrawide at 144Hz. I debated back and forth between 4K and 2K, but ultimately having the extra FPS (100+) was more important to me than 4K at, say, 40-60 fps.

    Ultrawide is the most immersive addition I've made when it comes to gaming. Especially if you are an MMO player, that extra real estate on screen is something to see in games like WoW, FFXIV: ARR, GW2, etc. It's awesome. Don't go 28 inch; go 30+, as ultrawide seems smaller for some reason.

    You can get a decent 2K ultrawide for under a grand. Sounds like budget isn't a concern for you, though. The suggestion of the Predator with G-Sync would be my choice as well.
    OhhPaigeyultimateduck
    • Aloha Mr Hand ! 

  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    thunderC said:
    OhhPaigey said:
    Quizzical said:
    OhhPaigey said:
    Be warned that it's a TN monitor, and that its refresh rate is 60 Hz.  4K at 28" will also mean much smaller pixels than you're used to.  But getting rid of any of those drawbacks would require a much larger budget, so it's a question of priorities.

    What video card are you using?
    I have a 1080TI. How much would a good monitor cost? $1000+?
    I have a 980 Ti paired with an LG ultrawide at 144Hz. I debated back and forth between 4K and 2K, but ultimately having the extra FPS (100+) was more important to me than 4K at, say, 40-60 fps.

    Ultrawide is the most immersive addition I've made when it comes to gaming. Especially if you are an MMO player, that extra real estate on screen is something to see in games like WoW, FFXIV: ARR, GW2, etc. It's awesome. Don't go 28 inch; go 30+, as ultrawide seems smaller for some reason.

    You can get a decent 2K ultrawide for under a grand. Sounds like budget isn't a concern for you, though. The suggestion of the Predator with G-Sync would be my choice as well.
    A 34" monitor with a 21:9 aspect ratio is basically a 27" monitor with a 16:9 aspect ratio, plus some extra stuff on the sides.  The two monitors would be basically the same in the vertical direction with respect to both pixels and inches.

    In most cases, "ultrawide" is better described as short than wide, though at 34", you can argue that it isn't actually that short.  The problem is that for nearly everything you do on a computer monitor, even at 16:9, you run out of vertical space before horizontal.  That includes web browsing like you're doing right now; practically every site there is uses the sides just for ads because there's no other real use for it, and wider only means more ads or more blank space.  It also includes e-mail, word processing, or any other sort of reading or writing text, as reading very long lines is awkward, but taller lets you fit more lines.

    And running out of vertical space first also includes most games.  I once went back and looked through all the games I had to see if I could find any candidates that would have been better at 21:9 than 16:9 if the game had supported it.  The only ones I could come up with were Tecmo Super Bowl and Uniracers.  Old console games like that don't support adjustable resolutions, anyway.

    I have been told that the extra width is nice for first person shooters, which are a genre that I ignore.  I could believe that, as it gets you a little more peripheral vision, though still not very much.  I could also believe that the extra width is nice if you zoom way in so that even if you're not quite in first person perspective, you're awfully close to it.  But if the problem is that you can't see very well because you've zoomed in too far, the easier solution is to zoom out.

    That said, what matters is what works for the way you use the computer, not the way someone else does.  If playing first person shooters is 90% of what you do on a computer, then an ultra wide monitor makes a lot more sense for you than it does for me.

    Personally, I have three of these:

    https://www.newegg.com/Product/Product.aspx?Item=N82E16824236466

    They're in portrait mode, for a combined resolution of 4320x2560.  The combined diagonal measurement is 47", and the 27:16 aspect ratio is a little closer to square than 16:9.  The bezels don't bother me, though I could understand why some people wouldn't like them.  And the $1600 total price tag also rules out the configuration for most people.  But it was a way to get 144 Hz at higher than 4K more than three years ago.
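    For the curious, the combined numbers check out; a quick sketch using the ~13.2" x ~23.5" portrait footprint of a 27" 16:9 panel:

        import math

        # Each 2560x1440 panel rotated to portrait: ~13.2" wide, ~23.5" tall.
        width_in, height_in = 3 * 13.2, 23.5    # three side by side
        print(math.hypot(width_in, height_in))  # ~46" of glass; bezels push it toward the quoted 47"
        print(3 * 1440, 'x', 2560)              # 4320 x 2560, i.e. 27:16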
    OhhPaigey
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    OhhPaigey said:
    Quizzical said:
    OhhPaigey said:
    Be warned that it's a TN monitor, and that its refresh rate is 60 Hz.  4K at 28" will also mean much smaller pixels than you're used to.  But getting rid of any of those drawbacks would require a much larger budget, so it's a question of priorities.

    What video card are you using?
    I have a 1080TI. How much would a good monitor cost? $1000+?
    You've got a plenty powerful enough GPU, then.  I had to ask because you never know when someone will come in hoping to do 4K gaming on a $100 video card.

    You can get plenty high frame rates at high resolutions if you're willing to turn down settings.  Your video card can handle just about any game at 4K and 60 frames per second or higher if you're willing to turn down settings.  If you insist on max settings for everything, some games are going to choke at 4K.

    What makes sense for you depends tremendously on your preferences.  I'm not saying that the monitor you linked is bad.  I'm saying that it makes sense for some people and not others.  Some people would see 60 Hz or TN as a deal-breaker, while others would see them as not a big deal--and an acceptable price to get a 4K, 28" monitor without breaking the bank.

    I'd personally lean toward 2560x1440 because that makes it much easier to get plenty of inches so as not to make everything tiny, and 144 Hz if you want it.
    OhhPaigeyEponyxDamor
  • RidelynnRidelynn Member EpicPosts: 7,383
    edited November 2018
    I will say, for the purposes of gaming, I'd rather have refresh rate/FPS than resolution. VRR is, right now, a binary thing. If you own Nvidia, you have to use G-Sync. If you own AMD, you have to use FreeSync. If you use Intel... well... I feel sorry for you.

    I have 27" IPS 4K monitors right now (Dell P2715Q). I love the extra PPI for browsing the web, for editing stuff, and just general use. But as far as gaming goes - 4K isn't a large benefit over, say, 1080 with nice anti-aliasing. I picked them up because one of my 24' 1920x1200 monitors was starting to dim/yellow - it was 12 years old, so it had a good life. I needed something, nothing was available that ticks all my boxes, so I picked these up for pretty cheap to use until that very special monitor comes along one day....

    What I would really like to see is a consistent HDR implementation on the PC. On the PS4, HDR can be a dramatic improvement (it's a per-developer thing, some are better at it than others). I've never been terribly sensitive to FPS dips, but VRR (Freesync/Gsync) would go a long way to help with tearing. 

    My personal priorities are:

    - Physical size: It has to fit on my desk. I like twin monitors side by side; my workflow and gaming habits have more or less grown around that. That's a physical limitation I'm not willing to work around. I have 2x 27" right now, and I couldn't go much larger. I might be able to support a pair of 32", or do something strange like a 32" in landscape and a 24" next to it in portrait. I don't think I would care for a single 46", three monitors in portrait mode, or an ultrawide form factor. This is a personal preference thing though.
     
    - Sharp, bright colors: OLED is ~the best~, IPS right now is a distant second. I would consider a good VA, but TN isn't an option for me

    - Crisp Image: This is why I chose 4k over high refresh rate, for the extra PPI. I run at about 150% scaling. Not everything in Windows scales well, but the stuff that I use most does.

    - VRR: I currently have an nVidia card, I am not willing to pay extra for Gsync though, especially when I see the same monitor with Freesync often going for $200-$300 less

    - HDR: This actually is a higher priority for me on screens in general, but since Windows is absolutely abhorrent in HDR support, it's lower on the list for a PC monitor.

    - >60Hz Refresh

    I would gladly fork over $2000 for 27" 4K GSync 120Hz HDR monitors, ~if~ they were OLED. Monitors are an item that I don't upgrade often, and I have got a lot of years out of all my previous monitors, so I'm ok with paying for something that should last me for a few generations of hardware. I'd feel a whole lot better about it if we could get to one standard implementation of VRR, and I think the industry will move that way with HDMI 2.1, it just isn't there yet.

    That being said, I also don't game on my PC anywhere near as often as I used to. As a result, I find myself more forgiving of my current hardware, and much more selective when I do decide to finally upgrade something.
    OhhPaigey[Deleted User]
  • OhhPaigeyOhhPaigey Member RarePosts: 1,517
    There's a lot of knowledge here I wasn't otherwise considering. I think I'm going to go with some sort of 2K 144Hz monitor, because I'd rather have 100+ FPS on max at 2K than 60 or lower at 4K. I hate feeling sluggish in games. Also, low response time is important, because I'm pretty big into FPS games.
    Ozmodan
    When all is said and done, more is always said than done.