Rig Questions...

adderVXI Member Uncommon Posts: 727
Hello all,

I'm considering either a new build or refreshing my current one to run a 27" 165 Hz monitor at 2K.  I have an i7-3770K at 3.5 GHz with a Zotac GTX 1060 6 GB currently.  Not too sure if I can run this monitor effectively, however.  I will be using it for PlanetSide 2, WoW Classic, Warframe, etc.

If it's on the weak side I can buy a card or build something new, but I'm not sure what makes the most sense.

Thanks!

Government is not reason; it is not eloquent; it is force. Like fire, it is a dangerous servant and a fearful master.

George Washington

Comments

  • Cleffy Member Rare Posts: 6,413
    You won't get any better than that at a reasonable cost. It should be fine for 2K. Most of the higher-end solutions are for running things your monitor can't, like 4K resolution at over 60 fps or ray tracing.
  • Quizzical Member Legendary Posts: 25,483
    Do you already have the monitor, or are you looking to buy a new monitor, or what?
  • adderVXI Member Uncommon Posts: 727
    Quizzical said:
    Do you already have the monitor, or are you looking to buy a new monitor, or what?
    I don't have one yet, but I'm planning on buying one in a month or so.  I guess my concern was buying a $600 monitor that my system can't handle.  Been using 60 Hz forever and want to upgrade.

  • adderVXI Member Uncommon Posts: 727
    Cleffy said:
    You won't get any better than that at a reasonable cost. It should be fine for 2K. ...
    Cool.  I would just build a whole new system or upgrade graphics cards, but if this will work for a while longer, I'll hold off.  Sounds like it will.  Thanks!

  • Quizzical Member Legendary Posts: 25,483
    Check the monitor ports on the back of your video card, as well as what the monitor requires.  You probably want to use DisplayPort for both, as that's the modern standard.  It's probable that your video card will support the monitor, especially if you only want one monitor, but you might as well check to make sure.  Even if the monitor supports HDMI, it won't necessarily support it at the refresh rate you want.  For example, my monitors can do 144 Hz over DisplayPort or 60 Hz over HDMI.
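
    For a rough sense of why the port matters: at 2560x1440 and 165 Hz, the pixel data alone is pushing 15 Gbit/s, which fits within DisplayPort 1.2 but is roughly double what HDMI 1.4 can carry.  A quick back-of-the-envelope in Python (just a sketch; the blanking-overhead figure is an approximation, not a spec number):

    # Approximate link bandwidth needed for 2560x1440 @ 165 Hz, 24-bit color.
    # Real signals add blanking overhead (roughly 10-20% with reduced blanking).
    pixels_per_second = 2560 * 1440 * 165
    payload_gbps = pixels_per_second * 24 / 1e9
    print(f"pixel payload:      {payload_gbps:.1f} Gbit/s")   # ~14.6 Gbit/s
    print(f"with ~15% blanking: {payload_gbps * 1.15:.1f} Gbit/s")
    # DisplayPort 1.2 carries about 17.3 Gbit/s of data; HDMI 1.4 only about
    # 8.2 Gbit/s, which is why older HDMI ports top out around 60 Hz at 1440p.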

    That you have a GTX 1060 says which GPU chip is inside of the card, but doesn't tell you which monitor ports it has.  The combination of monitor ports can vary by SKU, even with the same GPU chip.  The way to find out what your video card has is to look at the back of the card--which is visible without having to open up your case.
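
    If you'd rather check from software than crawl behind the desk, a quick Python sketch along these lines works on Windows (EnumDisplaySettingsW is a real Win32 call; the DEVMODE structure below is the usual truncated version, whose printer fields share a union with the display fields in the real struct, so the offsets line up).  It lists every resolution and refresh rate combination your current card and driver expose:

    import ctypes

    # Truncated DEVMODEW: enough of the real struct to reach the display
    # fields; dmSize tells the API how much structure we actually passed.
    class DEVMODE(ctypes.Structure):
        _fields_ = [
            ("dmDeviceName", ctypes.c_wchar * 32),
            ("dmSpecVersion", ctypes.c_ushort),
            ("dmDriverVersion", ctypes.c_ushort),
            ("dmSize", ctypes.c_ushort),
            ("dmDriverExtra", ctypes.c_ushort),
            ("dmFields", ctypes.c_ulong),
            ("dmOrientation", ctypes.c_short),
            ("dmPaperSize", ctypes.c_short),
            ("dmPaperLength", ctypes.c_short),
            ("dmPaperWidth", ctypes.c_short),
            ("dmScale", ctypes.c_short),
            ("dmCopies", ctypes.c_short),
            ("dmDefaultSource", ctypes.c_short),
            ("dmPrintQuality", ctypes.c_short),
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", ctypes.c_wchar * 32),
            ("dmLogPixels", ctypes.c_ushort),
            ("dmBitsPerPel", ctypes.c_ulong),
            ("dmPelsWidth", ctypes.c_ulong),
            ("dmPelsHeight", ctypes.c_ulong),
            ("dmDisplayFlags", ctypes.c_ulong),
            ("dmDisplayFrequency", ctypes.c_ulong),
        ]

    user32 = ctypes.windll.user32
    mode = DEVMODE()
    mode.dmSize = ctypes.sizeof(DEVMODE)

    modes = {}  # (width, height) -> set of refresh rates in Hz
    i = 0
    # First argument None means the primary display device.
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
        res = (mode.dmPelsWidth, mode.dmPelsHeight)
        modes.setdefault(res, set()).add(mode.dmDisplayFrequency)
        i += 1

    for (w, h), freqs in sorted(modes.items()):
        print(f"{w}x{h}: {sorted(freqs)} Hz")

    If 2560x1440 shows up at the refresh rate you're after once the new monitor is plugged in, the ports and cable are doing their job; whether the games can feed it that many frames is a separate question.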

    Now, it's possible that in some games, with your current video card, you'd get 50 frames per second, both on the old monitor and the new.  If what you want is to actually hit 165 frames per second, then that's much, much harder.  Really, though, getting 100 frames per second on a 144 Hz monitor looks pretty good.
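
    To put numbers on that: the renderer's whole time budget is just 1000 ms divided by the target frame rate, so it shrinks from about 16.7 ms per frame at 60 fps to about 6.1 ms at 165 fps.  A few lines of Python show the arithmetic (these are budgets, not benchmarks):

    # Frame-time budget at various targets: budget_ms = 1000 / fps.
    for fps in (60, 100, 144, 165):
        print(f"{fps:>3} fps -> {1000 / fps:5.2f} ms per frame")
    # 165 fps leaves barely a third of the per-frame time that 60 fps does,
    # which is why it's much, much harder to hit.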

    Don't buy an overpriced G-Sync monitor.  Nvidia finally decided to support adaptive sync, which means that there's no sense in paying the extra $150 that it costs to get the same thing via Nvidia's proprietary implementation.  All that extra $150 would get you is reduced compatibility with future video cards that you might buy.
  • adderVXI Member Uncommon Posts: 727
    Quizzical said:
    Check the monitor ports on the back of your video card, as well as what the monitor requires. ...
    I checked and I have a few DisplayPort ports, so I should be good there.  I've noticed that FreeSync (which is what you're referring to, I assume) monitors are way less money too.

  • Quizzical Member Legendary Posts: 25,483
    adderVXI said:
    I checked and I have a few DisplayPort ports, so I should be good there.  I've noticed that FreeSync (which is what you're referring to, I assume) monitors are way less money too.
    Basically, the history there is that the industry standard is called Adaptive Sync.  AMD calls their implementation of it FreeSync.  Nvidia decided not to support the industry standard, but made up their own way to do the same thing called G-Sync, which was incompatible with what the rest of the world was doing.  Both FreeSync and G-Sync demand a little more than the basic industry standard, but they're basically equivalent.

    If you're a monitor vendor and are going to build a nice gaming monitor, adding FreeSync support is nearly free.  To get G-Sync support, you had to buy a physical module from Nvidia that cost $100.  Once various vendors took their markup, that typically added about $150 to the retail price of a monitor that would otherwise be identical (size, resolution, refresh rate, panel quality, etc.) to a FreeSync one.

    For about four years it was the case that a FreeSync monitor would support variable refresh rates with an AMD GPU but not Nvidia, while a G-Sync monitor would support variable refresh rates with an Nvidia GPU but not AMD.  G-Sync monitors will never be able to support that on an AMD GPU; Nvidia making it incompatible is the whole reason for G-Sync to exist.  Now Nvidia is hoping that people who happen to have a G-Sync monitor will feel forced to buy an Nvidia GPU in the future even if AMD offers a better deal for their budget.

    A FreeSync monitor could support variable refresh rates on Nvidia GPUs if Nvidia wanted it to, but for about four years, they simply refused, crippling their own drivers to spite their own customers for not buying into their vendor lock-in strategy.  About a month ago, Nvidia finally agreed to support the industry standard that the rest of the world was using and issued the necessary driver update to let your card support it--but only for Pascal or later cards, not the older Maxwell.

    That's why G-Sync monitors cost more.  And the lack of compatibility also makes them an outright inferior option compared to FreeSync.  With G-Sync basically dead as a result, it's not guaranteed that it will still have the driver support that you'd hope for several years from now, as G-Sync monitors will all be old, discontinued, legacy products by then.  I'm basically saying that you don't want to buy into G-Sync now, unless perhaps you find one at clearance prices that make it considerably cheaper than a FreeSync monitor.

    Nvidia has published a list of "G-Sync compatible" monitors here:

    https://www.nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/

    They're basically arguing that most of the FreeSync monitors out there aren't actually as good as real G-Sync monitors, but the ones on that list are.  For the most part, that's marketing FUD and nit-picking.  But some of them aren't that expensive, and all else equal, you'd rather have a monitor that properly supports everything that Nvidia thinks it should than one that doesn't.

    All else might not be equal, though, and you don't want to pay an extra $200 for Nvidia certification, as most of the ones that they found one excuse or another not to call "compatible" will be fine.  Nvidia will enable variable refresh rate for you by default if the monitor is "G-Sync compatible", but they'll also let you enable it yourself if it isn't.
  • adderVXI Member Uncommon Posts: 727
    Quizzical said:
    Basically, the history there is that the industry standard is called Adaptive Sync.  AMD calls their implementation of it FreeSync. ...
    Wow, thanks for all the info!!  This cleared up a bunch of questions I had that I didn't even know I had, lol.

  • TheDarkrayne Member Epic Posts: 5,297
    From personal experience, a 1060 has trouble maintaining even 60 fps at 1080p in most modern games.  I had to lower the settings in Anthem, for example.  You can have textures on max with little visible difference, but I had to lower some other settings to keep it in the 50-60 fps range.  Anthem might be better optimized at release, though, who knows.  I have to use adaptive settings in Shadow of War for 60 fps at 1080p (which isn't always 1080p, since it's adaptive).

    Depends on what you want to play, tbh.  Warframe is probably the most demanding game on your list, but it's still not very demanding compared to most newer games.

    I'd definitely upgrade to at least a 2070 if I had the spare cash, personally.

    But your new monitor will work fine; you just won't see the full benefit of it in some newer games without lowering settings, at least until you get a better GPU.
    I don't suffer from insanity, I enjoy every minute of it.
  • adderVXI Member Uncommon Posts: 727
    TheDarkrayne said:
    From personal experience, a 1060 has trouble maintaining even 60 fps at 1080p in most modern games. ...
    I guess I could get the monitor, and if I'm coming up short and don't like how much I have to lower the graphics settings, I can then do a new card.  Thanks!

  • Quizzical Member Legendary Posts: 25,483
    Because it's so easy to upgrade a monitor independent of upgrading other components, my recommendation would be to get the monitor you want first.  Plug it in and see if you get the performance you're after.  If you do, you're done.  Only upgrade the video card if you run into a situation where the one you have isn't good enough.
  • adderVXI Member Uncommon Posts: 727
    So this is what I was looking at for a monitor:
    https://www.amazon.com/Swift-PG278QR-G-SYNC-Gaming-Monitor/dp/B01N4ENDXR

    But after our discussion, could you recommend something else?

  • Ridelynn Member Epic Posts: 7,383
    https://www.amazon.com/Samsung-C27HG70-27-Inch-Monitor-C27HG70QQN/dp/B06XSQ5QN8

    That is what I would get.  Same screen size and resolution, better screen (arguably; VA versus TN gives much better color and viewing angles).  Sorta kinda supports HDR and FreeSync 2.  Almost $100 less.
  • Ridelynn Member Epic Posts: 7,383
    Also, I totally agree with "get the monitor, see how your system performs, then upgrade if needed."

    I have a feeling it will perform fairly well.  It may not be able to MAX every setting and stay dead on at 144 Hz, but I bet it will still look good, especially with VRR in the mix.
  • Quizzical Member Legendary Posts: 25,483
    High refresh rate monitors tend to be 144 Hz, not 165 Hz.  If you're dead set on 165 Hz, then that will limit your options a lot.  What Ridelynn mentions is one alternative.  Another is this:

    https://www.amazon.com/Acer-XG270HU-omidpx-FREESYNC-Widescreen/dp/B00VRCLHYS

    That's about equivalent to what you linked, but a whole lot cheaper.  It's also officially "G-Sync compatible", so Nvidia officially thinks that it's a good monitor, too.

    Another alternative is the one I have:

    https://www.amazon.com/27-inch-FreeSync-Response-DisplayPort-MG279Q/dp/B00ZOO348C

    Similar specs and a similar price to what you linked, but with an IPS panel, so much better image quality.  It doesn't offer a variable refresh rate above 90 Hz, though, so you can't get 144 frames per second; it has to drop all the way to 90.  But still, 90 frames per second is a lot.  That's likely the reason why it's not "G-Sync compatible".

    Whether you want a better refresh rate or better image quality is a matter of personal opinion.
  • adderVXI Member Uncommon Posts: 727
    Ridelynn said:
    Also, I totally agree with "get the monitor, see how your system performs, then upgrade if needed." ...
    I will check that one out!  Thanks!

  • adderVXI Member Uncommon Posts: 727
    Quizzical said:
    High refresh rate monitors tend to be 144 Hz, not 165 Hz. ...
    Awesome!  Thanks for all the help!!  This will save me a bunch of cash.
