
Nvidia RTX 3060 Ti Founders Edition Review | MMORPG.com

System, Member Uncommon, Posts: 12,599
edited December 2020 in News & Features Discussion


With AMD and NVIDIA trading blows at the upper reaches of the GPU market, could NVIDIA's GeForce RTX 3060 Ti be the right on-ramp into the RTX 30-series at a fraction of the cost? We are about to find out.

Read the full story here



Comments

  • Quizzical, Member Legendary, Posts: 25,483
    And now we start seeing the new cards come toward the price ranges that more gamers are willing to pay. Having reviews go up the day before parts go on sale doesn't imply confidence in a hard launch. Really, though, we'll probably see a lot of these come to market pretty soon, as they're just salvage parts of the same die that the RTX 3070 used, and this card doesn't have the same risk factors that have caused the RTX 3080/3090 to have barely shipped any cards even 2 1/2 months after launch.

    AMD will eventually have something new in this price range, too. But the real question is when you'll actually be able to buy the cards, not when the launch day will be. I'd bet that Nvidia gets there first, as AMD still has Zen 3 chiplets eating up much of their TSMC wafer capacity, and on a per wafer basis, Zen 3 is several times as profitable as consumer GPUs.
  • achesoma, Member Rare, Posts: 1,768
    Why would someone buy this over a 3070 with only a difference of $100? (Other than availability)
  • Forgrimm, Member Epic, Posts: 3,069

    Quizzical said:

    But the real question is when you'll actually be able to buy the cards, not when the launch day will be.



    Probably sometime around early Spring of 2021.

  • Quizzical, Member Legendary, Posts: 25,483
    achesoma said:
    Why would someone buy this over a 3070 with only a difference of $100? (Other than availability)

    A $100 price difference is a significant reason.  If you ignore that, then you might as well ask why anyone would ever buy anything other than the top of the line.
  • Gorwe, Member, Posts: 1,593
    Why is AMD represented with 6800s? They're at least a class above the 3060 Ti. Some other AMD/ATi card belongs on this list, from the (lower) middle class. Some x600?
  • Quizzical, Member Legendary, Posts: 25,483
    Gorwe said:
    Why is AMD represented with 6800s? They're at least a class above the 3060 Ti. Some other AMD/ATi card belongs on this list, from the (lower) middle class. Some x600?
    This card slots in between a Radeon RX 6800 and a Radeon RX 5700 XT.  Both of those cards are included in the review, and they're the nearest comparables from AMD.  You might argue that some future Radeon RX 6700 or RX 6600 or some such would be a better comparable, but that future card hasn't yet launched and therefore cannot be included.
  • Quizzical, Member Legendary, Posts: 25,483
    remsleep said:
    Mars_OMG said:
    Not really affordable for mid-tier gamers. DLSS and ray tracing are novelties that most gamers lived without for over 25 years; saying they are actual features is laughable. Disco is dead... :)

    When major blockbuster games like Cyberpunk 2077, as well as Minecraft and Fortnite, all have ray tracing support, it's not a novelty.

    Also, when both Nvidia and AMD add ray tracing support in hardware, again, that's not a novelty.

    Ray Tracing is the future of video game graphics, as that is the best looking, most immersive way to render lighting/shadows.

    What is laughable is people who discount ray tracing as some gimmicky tech, when in fact it produces the best image quality and always has; it was just far too expensive to do in real time until very recently.

    Ray Tracing is here to stay.


    Also, humans lived without electricity for thousands of years. Your argument about 25 years of gaming without ray tracing is such shit; as if going backwards to something inferior is better?

    We've lived without modern medicine for ages ... let's go back to bloodletting and drilling holes in people's craniums, yeah!
    At the moment, real-time ray tracing pretty much is a novelty.  A handful of games using it to do a little bit while still mostly relying on rasterization is hardly the goal.  Quake II RTX is the only game I'm aware of to go full ray-tracing as opposed to just doing a little bit with it, and that's a remaster of a game that originally released in 1997.  It's probably going to become a lot more common in coming years, but it's not there yet.

    The problem is that by the time real-time ray-tracing is common enough for most gamers to actually care about it, neither a GeForce RTX 3090 nor a Radeon RX 6900 XT is likely to be powerful enough to handle the ray-tracing that you want to use in newer games.  Now, if you want to be on the bleeding edge and get excited about a handful of games doing a little bit of ray-tracing, then go ahead.  But I don't think that's a very large niche.

    For comparison, the first game I played that relied heavily on rasterization was StarFox, in 1993.  The first game I played a lot (as opposed to briefly) that relied heavily on rasterization was A Tale in the Desert, in 2003.  That's quite a gap, and hardware was advancing much faster then than it is today.  Depending on how long the fabs can continue advancing, it's quite possible that the heavy ray-tracing future that people imagine will never arrive.

    That's not to say that GPU vendors shouldn't be pushing ray-tracing.  It has to start somewhere, and if you don't start with immature implementations and work on improving them, that ray-tracing future will never arrive.  But for now, it's a small niche.
  • Vrika, Member Legendary, Posts: 7,973
    Quizzical said:
    At the moment, real-time ray tracing pretty much is a novelty.  A handful of games using it to do a little bit while still mostly relying on rasterization is hardly the goal.  Quake II RTX is the only game I'm aware of to go full ray-tracing as opposed to just doing a little bit with it, and that's a remaster of a game that originally released in 1997.  It's probably going to become a lot more common in coming years, but it's not there yet.

    The problem is that by the time real-time ray-tracing is common enough for most gamers to actually care about it, neither a GeForce RTX 3090 nor a Radeon RX 6900 XT is likely to be powerful enough to handle the ray-tracing that you want to use in newer games.  Now, if you want to be on the bleeding edge and get excited about a handful of games doing a little bit of ray-tracing, then go ahead.  But I don't think that's a very large niche.

    For comparison, the first game I played that relied heavily on rasterization was StarFox, in 1993.  The first game I played a lot (as opposed to briefly) that relied heavily on rasterization was A Tale in the Desert, in 2003.  That's quite a gap, and hardware was advancing much faster then than it is today.  Depending on how long the fabs can continue advancing, it's quite possible that the heavy ray-tracing future that people imagine will never arrive.

    That's not to say that GPU vendors shouldn't be pushing ray-tracing.  It has to start somewhere, and if you don't start with immature implementations and work on improving them, that ray-tracing future will never arrive.  But for now, it's a small niche.
    Ray-tracing is the future of video games because in the near future we'll get a lot more games that use it.

    It doesn't need to replace other techniques. Current games with ray-tracing have already shown that it works well enough in combination with them.
  • Jamar870, Member Uncommon, Posts: 573
    As to Remsleep's remark, when I see more games with this, then I'll start thinking it's mainstream.  By more, I mean 8 or more well-known titles.
  • Quizzical, Member Legendary, Posts: 25,483
    Vrika said:
    Ray-tracing is the future of video games because in the near future we'll get a lot more games that use it.

    It doesn't need to replace other techniques. Current games with ray-tracing have already shown that it works well enough in combination with them.
    Is SMAA the future of video games because there will be more games that use it in the future?  Is tessellation the future?  Is ambient occlusion the future?  Are sprites the future, since there are still people making games with RPG Maker?  Are they all the future?  What does that even mean?

    I think it's very important to distinguish between a game that uses a little bit of ray-tracing as one graphical feature among many and a game that relies so heavily on ray-tracing that the game can't even be played without it.  The former is an incremental change and could reasonably be called a novelty.  The latter is a revolution.
  • Vrika, Member Legendary, Posts: 7,973
    edited December 2020
    Quizzical said:
    Is SMAA the future of video games because there will be more games that use it in the future?  Is tessellation the future?  Is ambient occlusion the future?  Are sprites the future, since there are still people making games with RPG Maker?  Are they all the future?  What does that even mean?
    I'd interpret future tech as something that wasn't common in the past but will be common in the future.

    Real-time ray-tracing in games was almost non-existent in the past, but now we've got working tech, and with support from the new console generation it looks like it's becoming common. It's also not just being used by some marginal games; we're getting big-name mainstream games like Battlefield, Call of Duty, Cyberpunk and World of Warcraft with ray-tracing support.
  • Vrika, Member Legendary, Posts: 7,973
    Quizzical said:

    I think it's very important to distinguish between a game that uses a little bit of ray-tracing as one graphical feature among many and a game that relies so heavily on ray-tracing that the game can't even be played without it.  The former is an incremental change and could reasonably be called a novelty.  The latter is a revolution.
    I disagree.

    The few fully ray-traced games at the moment are just a novelty (even if their tech is kind of revolutionary), whereas using ray-tracing in combination with other techniques is becoming too commonplace to be called a novelty.
  • Vrika, Member Legendary, Posts: 7,973
    edited December 2020
    RTX 3060 Ti pre-launch supply numbers from retailer Proshop:

    Proshop ordered a total of 5,637 RTX 3060 Ti cards from manufacturers, and yesterday, before launch, only 195 of them had been delivered - about 3.5% of the cards they had ordered.

    Source: https://www.proshop.de/RTX-30series-overview

    It's yet another launch with inventory so low that getting a card is almost impossible.
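
    For anyone who wants to sanity-check that percentage, a minimal sketch in Python (the 5,637 and 195 figures are just the Proshop numbers quoted above):

    # Rough check of the Proshop delivery rate quoted above.
    ordered = 5637     # total RTX 3060 Ti cards Proshop ordered
    delivered = 195    # cards delivered as of the day before launch
    print(f"{delivered / ordered:.1%} of the ordered cards arrived")  # ~3.5%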
  • Quizzical, Member Legendary, Posts: 25,483
    At this point, the stock levels are getting ridiculous.  There were obvious reasons why the RTX 3080 and 3090 and RX 6000 series would be in very short supply.  But the RTX 3060 Ti and 3070 should be fairly routine parts, and those aren't showing up, either.  I'd assume that they're going to show up in huge volumes very soon, as otherwise, something is severely wrong at either Samsung or Nvidia.

    The Ryzen 5000 series has had considerable stock, in contrast.  I've seen it appear in stock, albeit well above MSRP, several times on New Egg.  The Ryzen 7 5800X is in stock at New Egg from two different suppliers right now, even, albeit at $660 or $581 (as compared to a $450 MSRP).  I've never seen any of the new generation video cards in stock on New Egg at any price.
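
    As a quick illustration of those markups, a small sketch (the $660, $581, and $450 figures are the ones mentioned above):

    # Markup over MSRP for the two Ryzen 7 5800X listings mentioned above.
    msrp = 450
    for price in (660, 581):
        print(f"${price}: {price / msrp - 1:.0%} above the ${msrp} MSRP")
    # prints roughly 47% and 29% above MSRP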
  • Forgrimm, Member Epic, Posts: 3,069
    I was checking Newegg just for shiggles. It went from no listings showing up for the 3060 Ti to all of the listings showing up but being sold out, lol.
  • Vrika, Member Legendary, Posts: 7,973
    Quizzical said:
    At this point, the stock levels are getting ridiculous.  There were obvious reasons why the RTX 3080 and 3090 and RX 6000 series would be in very short supply.  But the RTX 3060 Ti and 3070 should be fairly routine parts, and those aren't showing up, either.  I'd assume that they're going to show up in huge volumes very soon, as otherwise, something is severely wrong at either Samsung or Nvidia.

    The Ryzen 5000 series has had considerable stock, in contrast.  I've seen it appear in stock, albeit well above MSRP, several times on New Egg.  The Ryzen 7 5800X is in stock at New Egg from two different suppliers right now, even, albeit at $660 or $581 (as compared to a $450 MSRP).  I've never seen any of the new generation video cards in stock on New Egg at any price.
    Ethereum is upgrading to version 2.0 of their blockchain and a lot of Ethereum miners need to buy more powerful GPUs to mine that. Also Ethereum prices have risen a lot this year.

    There's no confirmed info, but there are rumors that NVidia is selling a lot of the new GPUs directly to crypto farms.
  • Forgrimm, Member Epic, Posts: 3,069
    Vrika said:
    Ethereum is upgrading to version 2.0 of their blockchain and a lot of Ethereum miners need to buy more powerful GPUs to mine that. Also Ethereum prices have risen a lot this year.

    There's no confirmed info, but there are rumors that NVidia is selling a lot of the new GPUs directly to crypto farms.
    According to financial analysts, the estimate is that Nvidia sold around $175 million worth of RTX 30 series cards directly to crypto-miners. https://wccftech.com/nvidia-allegedly-sold-175-million-worth-ampere-geforce-rtx-30-gpus-to-miners/
  • laserit, Member Legendary, Posts: 7,591
    Quizzical said:
    At this point, the stock levels are getting ridiculous.  There were obvious reasons why the RTX 3080 and 3090 and RX 6000 series would be in very short supply.  But the RTX 3060 Ti and 3070 should be fairly routine parts, and those aren't showing up, either.  I'd assume that they're going to show up in huge volumes very soon, as otherwise, something is severely wrong at either Samsung or Nvidia.

    The Ryzen 5000 series has had considerable stock, in contrast.  I've seen it appear in stock, albeit well above MSRP, several times on New Egg.  The Ryzen 7 5800X is in stock at New Egg from two different suppliers right now, even, albeit at $660 or $581 (as compared to a $450 MSRP).  I've never seen any of the new generation video cards in stock on New Egg at any price.
    I picked up a 5950X a few days ago at Newegg Canada for $899 CAD. That works out to about $700 USD. I also picked up an RTX 3090 from Amazon three days ago for $200 above MSRP. I couldn't get the RAM I wanted and had to settle for something slightly slower.

    "Be water my friend" - Bruce Lee

  • voltt, Member Uncommon, Posts: 432
    Got my Asus Dual 3060 Ti card today. Brick-and-mortar stores FTW. They had more stock than expected, with about 20-25 cards in stock, roughly 5 per model, at this one store.
  • Quizzical, Member Legendary, Posts: 25,483
    Forgrimm said:
    According to financial analysts, the estimate is that Nvidia sold around $175 million worth of RTX 30 series cards directly to crypto-miners. https://wccftech.com/nvidia-allegedly-sold-175-million-worth-ampere-geforce-rtx-30-gpus-to-miners/
    While I have no inside information here, I'd regard that as extremely unlikely on several grounds.

    First is that it would imply that Nvidia has had $175 million worth of RTX 3000 series cards to sell.  That's hundreds of thousands of cards, which would massively dwarf the amounts that have actually reached retail.  That's not impossible, but I do think it's improbable.
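
    As a rough illustration of that "hundreds of thousands of cards" figure, a small sketch (the launch MSRPs are assumptions about which models would make up such a sale, not anything from the report):

    # How many cards $175M would imply at various launch MSRPs.
    total_revenue = 175_000_000
    launch_msrp = {"RTX 3070": 499, "RTX 3080": 699, "RTX 3090": 1499}
    for card, price in launch_msrp.items():
        print(f"{card}: ~{total_revenue // price:,} cards if the whole sum were this model")
    # roughly 117,000 to 350,000 cards depending on the mix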

    Second is that it would imply that Nvidia preferred to sell cards to miners rather than gamers.  That would be true if the miners were willing to pay professional card prices (formerly branded Tesla or Quadro), but the miners aren't.  If it's GeForce cards, they prefer to sell to gamers if they can.  They might prefer miners for GPU dies that can't be sold to gamers (e.g., if the video decode block is defective, miners don't care), or happily sell additional cards to miners once gamers have bought all that they want.

    Third is that miners might not even want the cards.  Remember that gaming doesn't scale well to many GPUs, but mining sure does.  The RTX 3090 and Radeon RX 6900 are priced such that getting more, cheaper cards will probably get you better performance for the same money than getting fewer, higher end cards.

    At least in its original incarnation, Ethereum was very memory heavy, and by design.  It actually used very little compute, but mostly leaned on memory bandwidth.  Assuming that's still the case, a Radeon RX 570 8 GB should offer more than half of the performance of a GeForce RTX 3070, and for less than half of the price.  With the former readily available for $200, I'd think that the latter shouldn't be that interesting to miners at $500.
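
    Since Ethereum mining throughput tracks memory bandwidth fairly closely, a bandwidth-per-dollar sketch illustrates the point (the bandwidth numbers are the cards' published memory specs; the prices are the ones mentioned above):

    # Memory bandwidth per dollar, as a rough proxy for Ethereum mining value.
    cards = {
        # name: (memory bandwidth in GB/s, price in USD)
        "Radeon RX 570 8GB": (224, 200),
        "GeForce RTX 3070":  (448, 500),
    }
    for name, (bandwidth, price) in cards.items():
        print(f"{name}: {bandwidth} GB/s for ${price} -> {bandwidth / price:.2f} GB/s per dollar")
    # RX 570: 1.12 GB/s per dollar vs RTX 3070: 0.90 GB/s per dollar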

    Ethereum was specifically designed to be memory hard rather than compute heavy, as the latter will inevitably be dominated by ASICs.  They wanted Ethereum to be mined by many people with ordinary consumer hardware, not just a handful of people who buy a zillion ASICs.

    You can make an algorithm require at least some amount of memory, so that cards with less memory than that basically can't do it, or at least, not in a competitive manner.  But the RX 570 8 GB has the same amount of memory as the RTX 3070, so if the latter can do the algorithm, the former should, too.  And if they designed the algorithm such that 8 GB isn't enough, then the RTX 3070 shouldn't be interesting to miners.
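
    To make that memory cutoff concrete, a minimal sketch of the eligibility check (the ~4 GB working-set size is an assumption about Ethereum's DAG in late 2020; the exact figure grows over time):

    # A card can only mine a memory-hard algorithm if its working set fits in VRAM.
    def can_mine(vram_gb: float, working_set_gb: float = 4.0) -> bool:
        return vram_gb >= working_set_gb

    for card, vram in {"RX 570 8GB": 8, "RTX 3070": 8, "GTX 1060 3GB": 3}.items():
        print(card, "can mine" if can_mine(vram) else "is locked out")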
  • Quizzical, Member Legendary, Posts: 25,483
    Now that I've looked into it, it appears that Ethereum 2.0 can't be mined at all.  Rather, part of the point is moving from proof of work (i.e., proof that someone did massive amounts of mining) to proof of stake.  So that shouldn't lead to massive GPU purchases by miners.
  • Vrika, Member Legendary, Posts: 7,973
    Quizzical said:
    Now that I've looked into it, it appears that Ethereum 2.0 can't be mined at all.  Rather, part of the point is moving from proof of work (i.e., proof that someone did massive amounts of mining) to proof of stake.  So that shouldn't lead to massive GPU purchases by miners.
    Looks like you're correct. I guess it was just a false rumor/stupid analyst then.

  • Forgrimm, Member Epic, Posts: 3,069
    Vrika said:
    Looks like you're correct. I guess it was just a false rumor/stupid analyst then.

    While the analyst's claims may not be true, the fact that Ethereum is making the transition to 2.0 this month doesn't necessarily mean that a big chunk of cards weren't sold to miners.
  • Quizzical, Member Legendary, Posts: 25,483
    Forgrimm said:
    While the analyst's claims may not be true, the fact that Ethereum is making the transition to 2.0 this month doesn't necessarily mean that a big chunk of cards weren't sold to miners.
    It's still unlikely that Nvidia intentionally sold most of the cards to miners, for the reasons I listed above.

    And it's not like wccftech is scrupulous about caring whether a rumor is true before they publish it.  They're probably just quoting some random person who is guessing.
  • Forgrimm, Member Epic, Posts: 3,069
    Quizzical said:
    It's still unlikely that Nvidia intentionally sold most of the cards to miners, for the reasons I listed above.

    And it's not like wccftech is scrupulous about caring whether a rumor is true before they publish it.  They're probably just quoting some random person who is guessing.
    Yeah, I also have a somewhat tough time believing that Nvidia would have done mass direct sales to miners. The 30-series cards were being hyped up for mining purposes before release, and the 3080 in particular was receiving a lot of attention for its mining potential. So I have no doubt that many of the new cards were bought up by crypto-miners, but it would seem odd for Nvidia to do direct sales that way.