GPU and monitor update for 4K gaming?

treyu86treyu86 Member UncommonPosts: 270
Hello all,

I have been thinking about upgrading some PC components to achieve 4K gaming, and I would love to hear your advice. This is my current PC gaming hardware:

- MSI Z370 Gaming Plus motherboard
- Intel Core i5-8600K
- Nvidia GeForce GTX 1080 Ti 11 GB GPU
- 16 GB DDR4-2133 RAM
- Dell UltraSharp U2515H 2K (1440p) monitor

I don't need the newest and most expensive GPU; an improvement over the one I have that lets me game at 4K (with ray tracing) is enough.

Any help? Thanks in advance!

Comments

  • RidelynnRidelynn Member EpicPosts: 7,383
    edited August 2022
What's the budget? Do you have any FPS preferences or requirements? And do you have any target games? Do you have a monitor in mind?

    Running most 10+ year old MMOs at 4K is an entirely different kettle of fish than, say, running the latest Call of Duty in 4K.

    What you have now will technically drive games at 4K, it's just a matter of how well for a particular game. Many older games will do just fine.

I played MMOs and similar games at 4K/60 on a 4970 and a GTX 980 for a long time - not every setting was at MAX/MAX, but it was certainly playable. Obviously no ray tracing, and if you tried to crank up the graphics settings on new games it would struggle; it's just a matter of how much you're willing to pay to crank your settings up for given games.
  • VrikaVrika Member LegendaryPosts: 7,973
    edited August 2022
The GTX 1080 Ti is still a good graphics card. It's about equal to an RTX 3060, which is at the moment selling for about $400.

If you want to upgrade from it now, I'd recommend getting an RTX 3080 Ti to get a significant upgrade.

But both Nvidia and AMD should release their next GPU generation within the next 6 months. As long as all your games work properly with your current monitor and GPU combo, I'd recommend waiting a bit and doing the 4K upgrade once those have been released.
     
  • treyu86treyu86 Member UncommonPosts: 270
Ok, thanks both for the advice @Vrika @Ridelynn. Do you know of any good quality (and good in price too :P) 4K monitor?
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    Why are you looking to upgrade to a 4K monitor?  I'm not saying that you shouldn't, but different people have different reasons.

    Are you planning on getting a much larger monitor as well?  If you get a 4K monitor the same size as your previous one, then games or other programs that don't scale well to higher resolutions will make everything look tiny and be awkward to use.  That's much more of a problem with older games than newer ones.
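To put rough numbers on the "everything looks tiny" point, here is a quick pixel-density sketch (the 25" 1440p figure matches a U2515H-class panel; the other sizes are just illustrative):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Same physical size, higher resolution -> higher PPI -> smaller UI elements
print(round(ppi(2560, 1440, 25)))  # 25" 1440p: ~117 PPI
print(round(ppi(3840, 2160, 25)))  # 25" 4K:    ~176 PPI, unscaled UIs shrink ~1.5x
print(round(ppi(3840, 2160, 32)))  # 32" 4K:    ~138 PPI, closer to familiar density
```

At the same screen size, 4K packs about 1.5x the linear pixel density of 1440p, which is exactly why a game with no UI scaling looks tiny.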

    Considering how underwhelming of upgrades the new generations of GeForce cards have been since Pascal (which was an excellent architecture for its day), I'd recommend waiting a bit longer for the new generations of cards to come out unless there's some reason why it's urgent to upgrade today.  If your old card is dead, then sure, you upgrade now.  But if your old card works well and you just feel like upgrading on general principle, then I'd wait.
  • RidelynnRidelynn Member EpicPosts: 7,383
    edited September 2022
    treyu86 said:
Ok, thanks both for the advice @Vrika @Ridelynn. Do you know of any good quality (and good in price too :P) 4K monitor?
    My recommendation:

    Don't. At least not for gaming. 

    I love 4K for web browsing, still images, clarity of text -- with the extremely big caveat that scaling is working correctly.

    But for moving images and, largely, gaming - it's mostly wasted.

HDR can make a huge impact - its use on the PC is somewhat restricted, but when it gets used, it's much more impactful than 4K.

I can also say there is something to running at 120Hz or better - it is smoother. I don't find it to be life-changing, but it's noticeable and nice, and as subtle as I find that effect, I would still rate it higher than the image quality improvement of gaming at 4K.

I've never been on the ultrawide kick myself, but most folks I know are gravitating towards 1440 ultrawide (3440x1440 -- 1.5x width) or super ultrawide (5120x1440 -- 2x width). I've always just run two monitors, but to each their own.

    4K has two ~huge~ drawbacks with gaming.

The first is that it just requires a lot of horsepower to run. It can be done, but the benefit over 1440 is marginal at best, in my opinion. You trade an awful lot in either framerate or eye candy in return for some extra real estate (or PPI) that was already pretty good for most folks - so that's not a great trade-off to make, at least for what it's costing you (not necessarily in terms of cash, but yeah, that too).
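The horsepower point is mostly just pixel count; a back-of-the-envelope comparison of the resolutions mentioned in this thread:

```python
# Pixels pushed per frame at common resolutions, relative to 1440p
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "ultrawide 1440": (3440, 1440),
    "4K": (3840, 2160),
}

base = 2560 * 1440  # 1440p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>14}: {pixels / 1e6:4.1f} MP ({pixels / base:.2f}x 1440p)")
```

4K works out to 2.25x the pixels of 1440p, so very roughly that much more shading work per frame before any other bottleneck enters the picture.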

    The second is scaling -- you have a lot of older games that just never anticipated running at resolutions that high. And they become almost impossible to play -- the UIs are too small, the text unreadable, just a mess. I find it more trouble than it's worth most of the time, particularly in older games.

Now, if you still want 4K -- the Dell 4K version of the monitor you have is nice (I have a P2715Q, which is a shade older). It's IPS but backlight bleed isn't horrible, the colors pop nicely, and it's not expensive at all. That said, it's just a basic monitor -- 4K 60Hz SDR -- but it's very good at that without breaking the bank.

If you want to go HDR and/or 120Hz+ at 4K -- get out your wallet. My advice there is honestly: get an LG C1 or C2 OLED. It's 4K, HDR, 120Hz... in any size you want as long as it's big (42" is the smallest size, the "normal" is 55"). Just don't mind that it's really a TV. It'll cost less than most monitors that have the same feature set, plus they are OLED, which blows away IPS or quantum dots or whatever else there is -- just don't forget to set up a screen saver.
  • CleffyCleffy Member RarePosts: 6,413
I have an RTX 3090, and ray tracing isn't ready for prime time. I cannot get playable framerates in Cyberpunk with ray tracing. Any top-range GPU from the last 4 years will be good enough for 4K without ray tracing.
  • VrikaVrika Member LegendaryPosts: 7,973
    treyu86 said:
Ok, thanks both for the advice @Vrika @Ridelynn. Do you know of any good quality (and good in price too :P) 4K monitor?
    I'd recommend looking at one of these:
       https://www.pcgamer.com/best-4k-monitors-for-gaming/


    Though to be honest I'd kind of agree with Quizzical and Ridelynn on questioning whether you should get a 4K monitor. Personally if I were buying a new gaming monitor now I'd be looking at the new curved 1440p monitors.
  • BrotherMaynardBrotherMaynard Member RarePosts: 647
Vrika said:
snip

The question was about good quality monitors, wasn't it? So going to sites like PC Gamer, IGN, Tom's Guide and similar Walmart-quality websites for this kind of review is laughable. Unless you want junk like ASUS, BenQ and similar "quality" monitors, of course. Otherwise it's best to stick to websites and fora aimed at photo/video professionals who actually know what good monitors should be capable of and have experience with a wide range of brands and models.

120/240 Hz screens might be the holy grail for gamers, but refresh rate in no way indicates quality - usually the opposite. If we are talking good quality, it will not be found in your random cheap gaming rubbish; you will need to go to monitors like NEC or Eizo. They are no longer as expensive as they used to be (that includes their professional products), and you can now get a non-professional 4K Eizo screen for a bit less than $1000, if my memory serves. A decent alternative used to be the semi-professional lineups by Dell and similar manufacturers.

Talking about gaming monitors and good quality in the same sentence is usually laughable, and advising someone who wants quality monitors to look at brands like Gigabyte or Acer, like this PC Gamer article does, is like telling an audiophile to buy a cheap plastic Logitech headset.

  • MMOman101MMOman101 Member UncommonPosts: 1,787
    edited September 2022
BrotherMaynard said:
snip


There are only something like 3 manufacturers of panels, and the software is not incredibly different. I get that panels are manufactured to different specs, but the difference between brands has shrunk considerably over the years.

    “It's unwise to pay too much, but it's worse to pay too little. When you pay too much, you lose a little money - that's all. When you pay too little, you sometimes lose everything, because the thing you bought was incapable of doing the thing it was bought to do. The common law of business balance prohibits paying a little and getting a lot - it can't be done. If you deal with the lowest bidder, it is well to add something for the risk you run, and if you do that you will have enough to pay for something better.”

--John Ruskin
  • WBadgerWBadger Member RarePosts: 381
BrotherMaynard said:
snip


Weird to practice elitism about monitors, especially when you're not just outside the ballpark of what everyone is talking about, but at the wrong sport across the country altogether.

Refresh rate has been proven to matter more for gaming than resolution. It's part of the reason why everybody is recommending going down to 2K: A. a monitor that has a reasonable refresh rate and also hits 4K resolution is going to be expensive as all hell, and B. it's also going to require significantly more power than what is probably allowed in his budget. There's a reason why monitors marketed for "gaming" focus on refresh rate: a monitor updating at a faster pace lets you track the action more easily in video games that may occasionally need you to react in a split second, which matters more than everything simply being clearer.

You're also talking about looking to photo and video professionals for what monitor to get, but... again, wrong sport altogether. It's like suggesting someone get an EPYC processor because "professionals recommend it for the amount of raw computing power," when that's a server processor that wouldn't be ideal for gaming and the person asked for a gaming processor.
  • BrotherMaynardBrotherMaynard Member RarePosts: 647
WBadger said:
snip

That's the sad part about gamers, isn't it - that this is their focus. The monitor is arguably one of the most important parts of your setup; it's what you spend hours looking at each day. Instead of spending more on a good monitor with an accurate picture and considerably less strain on their eyes, they spend more on stuff like CPUs and GPUs (which they will probably replace 3-4 times over the lifetime of their monitor). It's like deliberately eating junk food so that you can afford the latest smartphone every two years...

    I wasn't talking about resolution, either. That's already defined by the OP. There are way more important aspects of a PC monitor, including colour reproduction and accuracy, colour gradient, tone, temperature, uniformity across the panel, brightness control and local dimming, and a myriad of other things.

I am not arguing for getting a $30k - $50k reference monitor; it would be silly. But gamers going "oooh" at every 240 or 360Hz screen and ignoring stuff that actually matters (which many of them will only realise later in their lives during regular visits to ophthalmologists) are the other extreme. In a way, gamers are among the best consumer categories to sell stuff to: they generally only need to see higher numbers, grand vacuous slogans and the obligatory "X" slapped all over the product name. (That's probably why there is such a thing as 'gaming chairs', which are usually pretty inferior to any good quality office chair, but gamers buy them like crazy. But I digress...)

    Good quality 4k monitors can be found for $500-1500, but for that you need to shop outside the gaming brands - that's not where the quality is. Like I said, gamers these days will take virtually any junk with big numbers on the box and if somebody points out that's not what good quality is about and recommends shopping outside the likes of ASUS or Gigabyte for their monitors, they must surely be elitist, right?

MMOman101 said:
snip

    All panels come from just a few manufacturers. In the early LCD days there were just two, LG and Sony (if my memory serves) - now there are more. However, what you wrote is akin to saying all CPUs come from two manufacturers and the only difference is the software they run.

First, you're only talking about the panels, i.e. the substrate containing the liquid crystals - which is only one part of a very complex product. There is a lot more that the monitor manufacturer adds to the panel, including all the hardware actually needed to light and control the panel and all its characteristics.

Second, there are specialised monitor manufacturers with long-term contracts for their panels who always get the first pick. That's because they are willing to pay more for the best panels and because they need quality, since it is the core of their business. A random gaming Walmart brand like ASUS will take whatever panels remain, slap angular logos in aggressive colours on the box, and the average gamer will salivate over them.

    Third, there are also differences among the panel manufacturers. Some are more focused on mass production of cheaper panels, others go for better quality and higher price. Every little detail can make a difference, including the choice and quality of the materials (e.g. when making the glass or polymer for the panel), design (electrode pattern), liquid crystals themselves, quality control, etc. Add to it different quality criteria among different panel manufacturers (higher quality specs vs $$$) and you can end up with pretty different products already at the panel assembly stage.

In short, there's so much more that monitor manufacturers do before they even start designing software for their monitors. Just like you can have vastly different cars built on the same or similar chassis, there is a considerable difference between various monitor manufacturers and the quality of their products.

  • VrikaVrika Member LegendaryPosts: 7,973
BrotherMaynard said:
snip

You do know that an accurate picture and eye strain are two different things, right? If the color is a couple of percent off from what it's supposed to be, it may look slightly different, but it won't damage your eyes any more than entering a room that's accidentally been painted a slightly wrong color would.

A (professional) monitor that displays a still picture as accurately as possible is important for stuff like photo editing, and on a general level good for work. Games and entertainment usually benefit more from a monitor that focuses on displaying motion as well as possible. But for eye health both types are about equal.
     
  • BrotherMaynardBrotherMaynard Member RarePosts: 647
    edited September 2022
    Vrika said:
snip

Err... where did I write that they are the same thing (or even directly related)? Obviously they are not - at least not directly; I am not sure what incorrect colours displayed at some weird colour temperature do to our eyes. Nothing good, I suspect.

    But they are both part of what one would consider a good quality monitor, as mentioned by the OP, aren't they?

As for the last part about eye health: no, not even close. Show me one gaming monitor where you can adjust colour temperature (or even configure presets for it) together with brightness and ambient light adjustments, and with circadian dimming.

Not to mention the overall quality of the monitor (e.g. the quality and uniformity of the image and of the backlight), which I briefly mentioned above.

  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    edited September 2022
    Gaming monitors focus on refresh rate because reducing the latency between something happening in the game world and it being displayed on your screen will make you better at twitchy games.  For people who want to be highly competitive at twitchy games, reducing that latency is the most important thing that a monitor can do.

    But TN monitors with poor image quality and very high refresh rates are pretty terrible at anything other than twitchy games.  They have to be marketed as gaming monitors because they're terrible at everything else, and far too expensive to really be a cheap monitor.

    Once you get away from twitchy games, or even get into somewhat twitchy games where you don't care about being all that competitive, then maybe it still makes sense to get a 144 Hz monitor, but paying more to go above that is silly.  Good image quality is still nice to have, and will make everything that you use a monitor for more pleasant.

Outside of the most competitive gamers, I'd recommend IPS monitors just because of the better image quality. They don't cost that much more than TN monitors anymore, either. They're not as good as OLED, but OLED is still really expensive.
  • BrotherMaynardBrotherMaynard Member RarePosts: 647
    Quizzical said:
    Gaming monitors focus on refresh rate because reducing the latency between something happening in the game world and it being displayed on your screen will make you better at twitchy games.  For people who want to be highly competitive at twitchy games, reducing that latency is the most important thing that a monitor can do.
That is true only to a certain extent. Beyond a certain threshold, your reaction time will be the major delaying factor in getting things done in a twitchy game.

When I was in Hong Kong, I visited the local science museum. They had an excellent machine that let you test your reaction time in simulated traffic. You were driving at a certain speed, something happened on the screen, you reacted, and it would show you your response time and how far your car would travel before stopping. It was good fun, and my reaction time was anywhere between 0.3 and 0.6 seconds. A lot of factors come into the equation, including your age, physical state, time of day, distractions, etc. Excellent reaction time among humans is apparently around 0.2 seconds.

    Now take that number and think what a bump from 120 to 240 (or 360) Hz will do in that context. It is such a negligible part of the whole reaction time in a game as to be almost laughable. A 60 Hz monitor shows changes every 0.016 seconds. A 500 Hz screen will do that every 0.002 seconds. Congratulations, you have managed to shave off 0.014 seconds out of 0.2 (in the best cases) or more.
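That frame-interval arithmetic can be sketched directly (the 0.2-second figure is the reaction time quoted above):

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between screen updates, in milliseconds."""
    return 1000.0 / refresh_hz

REACTION_MS = 200.0  # ~0.2 s, a very good human reaction time

for hz in (60, 120, 240, 500):
    saved = frame_interval_ms(60) - frame_interval_ms(hz)
    print(f"{hz:>3} Hz: {frame_interval_ms(hz):5.2f} ms per frame, "
          f"saves {saved:5.2f} ms over 60 Hz ({saved / REACTION_MS:.1%} of reaction time)")
```

Even the jump from 60 Hz all the way to 500 Hz saves under 15 ms, i.e. well under a tenth of a best-case reaction time.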

    Beyond a certain (surprisingly low) threshold, your biggest enemy in reducing reaction time is not the refresh rate of your screen, it's your own biology.

    And yet we have this bizarre situation of selling up to 500 Hz monitors with crappy performance in anything but refresh rate to millions of kids and plonkers out there who only see the big numbers and think it will make them better gamers. Because higher numbers = better monitor, right? It would be funny if it wasn't so sad... But it does show how smart the manufacturers are, to be able to sell cheap junk for 5 times its actual worth. When you combine the power of marketing and celebrity names with general ignorance of the buyers, you can achieve virtually anything.

  • Vrika Member LegendaryPosts: 7,973
    Quizzical said:
    Gaming monitors focus on refresh rate because reducing the latency between something happening in the game world and it being displayed on your screen will make you better at twitchy games.  For people who want to be highly competitive at twitchy games, reducing that latency is the most important thing that a monitor can do.

    But TN monitors with poor image quality and very high refresh rates are pretty terrible at anything other than twitchy games.  They have to be marketed as gaming monitors because they're terrible at everything else, and far too expensive to really be a cheap monitor.

    Once you get away from twitchy games, or even get into somewhat twitchy games where you don't care about being all that competitive, then maybe it still makes sense to get a 144 Hz monitor, but paying more to go above that is silly.  Good image quality is still nice to have, and will make everything that you use a monitor for more pleasant.

    Outside of the most competitive gamers, I'd recommend IPS monitors just because of the better image quality.  They don't cost that much more than TN monitors anymore, either.  They're not as good as OLED, but OLED is still really expensive.
    Gaming monitors and IPS aren't mutually exclusive. For example, the gaming monitor list I linked earlier has mostly IPS monitors:
       https://www.pcgamer.com/best-4k-monitors-for-gaming/


    But both IPS and VA panels can be used to make good and bad monitors, and with those you should always read a review for that monitor instead of basing the decision too much on what kind of panel it has.
     
  • Quizzical Member LegendaryPosts: 25,483
    Quizzical said:
    Gaming monitors focus on refresh rate because reducing the latency between something happening in the game world and it being displayed on your screen will make you better at twitchy games.  For people who want to be highly competitive at twitchy games, reducing that latency is the most important thing that a monitor can do.
    That is true only to a certain extent. Beyond a certain threshold your reaction time will be the major delaying factor in getting things done in a twitch game.

    When I was in Hong Kong, I visited the local science museum. They had an excellent machine that let you test your reaction time in simulated traffic. You were driving at a certain speed, something happened on the screen, you reacted, and it showed you your response time and how far your car would travel before stopping. It was good fun, and my reaction time was anywhere between 0.3 and 0.6 seconds. A lot of factors come into the equation, including your age, physical state, time of day, distractions, etc. Excellent reaction time among humans is apparently around 0.2 seconds.

    Now take that number and think about what a bump from 120 to 240 (or 360) Hz will do in that context. It is such a negligible part of the whole reaction time in a game as to be almost laughable. A 60 Hz monitor shows changes every 0.016 seconds; a 500 Hz screen does so every 0.002 seconds. Congratulations, you have managed to shave 0.014 seconds off a total of 0.2 seconds (in the best case) or more.

    Beyond a certain (surprisingly low) threshold, your biggest enemy in reducing reaction time is not the refresh rate of your screen, it's your own biology.

    And yet we have this bizarre situation of selling up to 500 Hz monitors with crappy performance in anything but refresh rate to millions of kids and plonkers out there who only see the big numbers and think it will make them better gamers. Because higher numbers = better monitor, right? It would be funny if it wasn't so sad... But it does show how smart the manufacturers are, to be able to sell cheap junk for 5 times its actual worth. When you combine the power of marketing and celebrity names with general ignorance of the buyers, you can achieve virtually anything.
    Taking 5 ms off of your reaction time by updating the monitor 5 ms sooner isn't a very big deal for most purposes.  But for highly competitive e-sports in twitchy games, it can be the difference between winning and losing.  For comparison, the difference between an NFL player who runs a 4.35 second 40 yard dash and one who runs a 4.40 second 40 yard dash is significant.  Things that aren't a big deal for a casual competition can be quite important at the highest levels of competition.

    Even so, not all latency is the same, and latency between when you expect something to appear and when it actually does can be noticeable on much smaller scales than you seem to think.  Some years ago, I was developing an amateur game and noticed that the controls felt really laggy.  The controls had previously felt responsive.  It took me a while to track down why.  I had made a mistake in some threading logic that added about 40 ms to the latency of when something happened.  That difference of 40 ms was the difference between "controls feel good" and "something is horribly broken".  Once I fixed the mistake, the controls felt good again.

    The thing about input latency like that is that it's not measuring your reaction time to things appearing unexpectedly.  It's reaction time to the player pressing a button and expecting it to affect what appears on the screen immediately, then being annoyed at a delayed reaction.

    If you've ever played a game that offered both a hardware mouse pointer and a software pointer and tried both, the software pointer will feel so laggy as to be broken, even though the latency difference is only in the tens of milliseconds.
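    For what it's worth, the timer technique is simple to sketch. Nothing below comes from a real engine; the function names and the simulated 40 ms delay are purely illustrative:

```python
import time

def handle_input() -> float:
    # Timestamp the moment the button press is observed.
    return time.perf_counter()

def simulate_pipeline(delay_s: float) -> None:
    # Stand-in for game logic plus rendering work before the frame
    # reflecting the input is presented.
    time.sleep(delay_s)

def measure_input_latency(delay_s: float) -> float:
    pressed_at = handle_input()
    simulate_pipeline(delay_s)
    presented_at = time.perf_counter()
    return presented_at - pressed_at

# Simulate the ~40 ms of extra latency described above.
latency = measure_input_latency(0.040)
print(f"input-to-display latency: {latency * 1000:.0f} ms")
```

    Dropping timestamps like this at the input and presentation ends of the pipeline is enough to catch a regression of a few tens of milliseconds.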
  • BrotherMaynard Member RarePosts: 647
    edited September 2022
    Quizzical said:
    snip
    Taking 5 ms off of your reaction time by updating the monitor 5 ms sooner isn't a very big deal for most purposes.  But for highly competitive e-sports in twitchy games, it can be the difference between winning and losing.  For comparison, the difference between an NFL player who runs a 4.35 second 40 yard dash and one who runs a 4.40 second 40 yard dash is significant.  Things that aren't a big deal for a casual competition can be quite important at the highest levels of competition.

    That's not a good example. A 40-yard dash in a straight line requires your muscles to continuously work at peak capacity, plus making good use of the momentum. There is no twitch movement based on sensory input involved in such a dash; you do not need to turn 90 degrees at random moments or stop mid-run based on what you see. How fast that player can react to stuff happening on the field is another matter entirely and can't be measured by simply comparing speed and observing a 0.05s difference at the finish line.


    Even so, not all latency is the same, and latency between when you expect something to appear and when it actually does can be noticeable on much smaller scales than you seem to think.  Some years ago, I was developing an amateur game and noticed that the controls felt really laggy.  The controls had previously felt responsive.  It took me a while to track down why.  I had made a mistake in some threading logic that added about 40 ms to the latency of when something happened.  That difference of 40 ms was the difference between "controls feel good" and "something is horribly broken".  Once I fixed the mistake, the controls felt good again.
    It seems to me what you describe here has more to do with hand-eye coordination and the way the brain expects something while something slightly different is happening on the screen; the perception of lag is not so much a result of reaction time as of the slight delay your brain perceives and makes you aware of through a slightly uncomfortable feeling. A bit like the way our internal balance system can get messed up by slightly de-synced body movement vs visual input, e.g. in motion sickness.

    But this is not actually related to the refresh rate of your screen. Movement shown on a 120Hz screen and on a 240Hz screen will be the same, and the difference in how fast things appear will be entirely negligible, especially considering human reaction time. Whether the signal arrives in your brain a few milliseconds earlier makes no difference when your brain takes 0.2s (and more) to register it and fire up the required neurons in response.


    The thing about input latency like that is that it's not measuring your reaction time to things appearing unexpectedly.  It's reaction time to the player pressing a button and expecting it to affect what appears on the screen immediately, then being annoyed at a delayed reaction.

    But that is exactly what this is about, isn't it, going back to the earlier example of twitch games. It is about things (enemy player) appearing unexpectedly and you having to react before he does.

    I haven't seen any conclusive evidence that a 500, 360 or 240 Hz monitor leads to better reaction times in gaming - not even in competitive gaming. What I do see, however, is plenty of sponsored teams standing in front of their sponsors' logos happily saying "yeah, this 500Hz screen makes me Godlike!". Considering that such players are usually of a similar age and perform in the same (tournament) venue, and given a sufficiently large sample, they should be able to back up their claims with solid evidence.

    Put two similarly profiled groups of players in the same room, with the same hardware but two sets of monitors, one at 120Hz and the other at 240Hz, let them play repeatedly over several weeks and then analyse the results. I would eat my hat if you saw statistically significant evidence of the 240Hz monitors producing consistently better scores. No, all I see is marketing. And leaving that 0.000001% of the player population aside, refresh rate matters even less.
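    The analysis step of that experiment is simple enough. A sketch with made-up scores (a Welch t-test, since the two groups need not have equal variance; the numbers are purely illustrative):

```python
import math
import statistics

# Hypothetical match scores for the two groups in the proposed experiment.
group_120 = [52, 48, 55, 47, 50, 53, 49, 51]
group_240 = [51, 49, 54, 48, 52, 50, 53, 47]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

t = welch_t(group_120, group_240)
print(f"t = {t:.2f}")  # a value near zero means no detectable difference
```

    With real data, a |t| well below the usual critical values over many sessions would be exactly the null result predicted above.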

    We might be getting a bit too far from the original topic, though, so just to get back: as far as "good quality" monitors go, you would do much better to shop outside of the 'gaming' brands. You can get an excellent 60 - 120 Hz monitor to which the gaming-branded junk will barely compare - and it will be perfectly suitable for gaming too.

  • Quizzical Member LegendaryPosts: 25,483
    Quizzical said:
    Taking 5 ms off of your reaction time by updating the monitor 5 ms sooner isn't a very big deal for most purposes.  But for highly competitive e-sports in twitchy games, it can be the difference between winning and losing.  For comparison, the difference between an NFL player who runs a 4.35 second 40 yard dash and one who runs a 4.40 second 40 yard dash is significant.  Things that aren't a big deal for a casual competition can be quite important at the highest levels of competition.

    That's not a good example. A 40-yard dash in a straight line requires your muscles to continuously work at peak capacity, plus making good use of the momentum. There is no twitch movement based on sensory input involved in such a dash; you do not need to turn 90 degrees at random moments or stop mid-run based on what you see. How fast that player can react to stuff happening on the field is another matter entirely and can't be measured by simply comparing speed and observing a 0.05s difference at the finish line.
    My point is that a small percentage difference can make a large difference in outcomes at the highest levels of competition.  This fact doesn't depend on the level of physical exertion involved.


    Even so, not all latency is the same, and latency between when you expect something to appear and when it actually does can be noticeable on much smaller scales than you seem to think.  Some years ago, I was developing an amateur game and noticed that the controls felt really laggy.  The controls had previously felt responsive.  It took me a while to track down why.  I had made a mistake in some threading logic that added about 40 ms to the latency of when something happened.  That difference of 40 ms was the difference between "controls feel good" and "something is horribly broken".  Once I fixed the mistake, the controls felt good again.
    It seems to me what you describe here has more to do with hand-eye coordination and the way the brain expects something while something slightly different is happening on the screen; the perception of lag is not so much a result of reaction time as of the slight delay your brain perceives and makes you aware of through a slightly uncomfortable feeling. A bit like the way our internal balance system can get messed up by slightly de-synced body movement vs visual input, e.g. in motion sickness.

    But this is not actually related to the refresh rate of your screen. Movement shown on a 120Hz screen and on a 240Hz screen will be the same, and the difference in how fast things appear will be entirely negligible, especially considering human reaction time. Whether the signal arrives in your brain a few milliseconds earlier makes no difference when your brain takes 0.2s (and more) to register it and fire up the required neurons in response.
    In the example I gave, the latency difference was 0.04 seconds.  That isn't just guessing or making up numbers.  That's an empirical measurement.  I wrote the source code and had the ability to modify it to check system timers and see what was happening.  And it made a huge difference in how responsive the controls felt.

    Yes, 0.01 seconds is a lot less than that.  But that's still a meaningful difference in how smooth controls will feel.  I'm not just guessing here.  I know from personal experience with inserting and checking system timers.  How about you?  Have you ever tested your claims to that degree?


    The thing about input latency like that is that it's not measuring your reaction time to things appearing unexpectedly.  It's reaction time to the player pressing a button and expecting it to affect what appears on the screen immediately, then being annoyed at a delayed reaction.

    But that is exactly what this is about, isn't it, going back to the earlier example of twitch games. It is about things (enemy player) appearing unexpectedly and you having to react before he does.

    I haven't seen any conclusive evidence that a 500, 360 or 240 Hz monitor leads to better reaction times in gaming - not even in competitive gaming. What I do see, however, is plenty of sponsored teams standing in front of their sponsors' logos happily saying "yeah, this 500Hz screen makes me Godlike!". Considering that such players are usually of a similar age and perform in the same (tournament) venue, and given a sufficiently large sample, they should be able to back up their claims with solid evidence.

    Put two similarly profiled groups of players in the same room, with the same hardware but two sets of monitors, one at 120Hz and the other at 240Hz, let them play repeatedly over several weeks and then analyse the results. I would eat my hat if you saw statistically significant evidence of the 240Hz monitors producing consistently better scores. No, all I see is marketing. And leaving that 0.000001% of the player population aside, refresh rate matters even less.

    We might be getting a bit too far from the original topic, though, so just to get back: as far as "good quality" monitors go, you would do much better to shop outside of the 'gaming' brands. You can get an excellent 60 - 120 Hz monitor to which the gaming-branded junk will barely compare - and it will be perfectly suitable for gaming too.

    I find it very implausible that getting an image up on the screen 2 ms faster won't tend to lead to a player reaction coming 2 ms sooner.  You can argue that that doesn't matter very much.  And in most cases, I'd even agree with that, or at least that it's less important than the difference between good image quality and bad image quality.  But there's still going to be a real latency difference.
  • Ridelynn Member EpicPosts: 7,383
    edited September 2022
    That's not a good example. A 40-yard dash in a straight line requires your muscles to continuously work at peak capacity, plus making good use of the momentum.

    You completely missed the point. Bless your heart. A for effort though on the reply.