Just picked up an RX 480 and it's pretty sweet. The gaming rig I started working on about a year ago is finally complete.
I had bad experiences with Radeon cards in the past (the distant past, to be fair, in tech terms). But after reading up and arguing with people here, it seemed like they had cleaned up their act. And the card seems pretty sweet: BDO looks nice, Overwatch and TERA are both pretty much maxed out, and Dark Souls 3 is really pretty.
The reason I picked this over, say, a 1060 is mostly G-Sync. G-Sync monitors are super expensive. I picked up a FreeSync monitor not too long ago, 144 Hz with a 1 ms response time, for Overwatch. I would've gladly gone G-Sync, but paying so much more just for that one feature seemed insane.
Since I already had FreeSync, it seemed natural to just get an AMD card to make use of it. Plus, 8GB of memory over 6GB future-proofs the card a bit more.
Anyway, long story short: Nvidia drove me away with their crazy G-Sync pricing, and the 480 seems pretty nice.
Comments
Welcome!
AMD has a corporate philosophy of embracing and advocating industry standards. Nvidia has a corporate philosophy of pushing proprietary solutions that offer vendor lock-in. G-sync is a recent example of this, but there are many others, from CUDA to PhysX to SLI motherboards to Cg.
Some years ago, the monitor industry got together and realized that something like G-Sync/FreeSync was practical. So they created an industry standard called adaptive sync. The idea was that you could get any monitor and any GPU and you could use adaptive sync.
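To make concrete what that buys you, here's a toy sketch in Python (purely illustrative, not any real driver or monitor API; the 60 Hz baseline and the assumed 40-144 Hz panel range are just made-up numbers) comparing when a finished frame actually reaches the screen on a fixed-refresh v-synced panel versus an adaptive sync panel:

import math

FIXED_INTERVAL_MS = 1000 / 60  # a plain 60 Hz panel refreshes every ~16.7 ms

def fixed_refresh_scanout_ms(frame_ready_ms):
    # With v-sync on a fixed-rate panel, a finished frame waits for the next refresh tick.
    return math.ceil(frame_ready_ms / FIXED_INTERVAL_MS) * FIXED_INTERVAL_MS

def adaptive_sync_scanout_ms(frame_ready_ms, min_hz=40, max_hz=144):
    # With adaptive sync, the panel refreshes when the frame is ready, clamped to the
    # panel's supported range (assumed to be 40-144 Hz here).
    return min(max(frame_ready_ms, 1000 / max_hz), 1000 / min_hz)

# A frame that takes 20 ms to render (50 fps):
print(round(fixed_refresh_scanout_ms(20), 1))   # 33.3 -- it waits for the second tick (stutter)
print(round(adaptive_sync_scanout_ms(20), 1))   # 20.0 -- it's shown as soon as it's done

That waiting-for-the-next-tick case is where the judder and added latency come from whenever the GPU can't hold exactly 60 fps; adaptive sync just lets the panel follow the GPU instead of the other way around.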
Nvidia saw this coming and decided to create their own proprietary version of it called G-sync. It's possible to do things faster if you don't need compatibility with everyone else, so Nvidia announced G-sync as if it were their own pet project and adaptive sync weren't going to exist. AMD quickly said, hey, there's an industry standard that does the same thing coming, and we're going to support it. AMD had long known this, of course, but just hadn't felt the need to announce it publicly until Nvidia jumped the gun with G-sync.
Because Nvidia was trying to take shortcuts, in order for G-sync to work, a monitor has to have a physical module provided by Nvidia that costs about $100. Once various markups are applied, this adds about $150 to the retail cost of a monitor, or more if you consider that G-sync monitors are low-volume products precisely because most people don't want to pay that extra $150.
FreeSync is just AMD's implementation of adaptive sync. To support adaptive sync, a monitor vendor doesn't have to buy special hardware from AMD. You just buy good-quality components from the same monitor component vendors you've been buying from for years, since the latest versions support adaptive sync. I know that the scalers are important to adaptive sync, but I don't know whether any other components need anything special. Low-end, cheap junk monitors and older components won't support it, but if you're building a good-quality monitor from modern components, it will happen to support adaptive sync, so you might as well claim FreeSync support, since it's an extra marketing bullet point and costs you nothing.
Thus, a FreeSync monitor is much cheaper than an otherwise identical G-sync monitor. For people who aren't immediately going to use either, getting the FreeSync monitor over the G-sync version is an easy and obvious choice. G-sync was available several months before adaptive sync, so there was a period when, if you wanted the functionality, G-sync was the only option. But we're now nearly two years into the era of adaptive sync monitors being available, and buying a G-sync monitor looks increasingly ridiculous.
Since G-sync does basically the same thing as adaptive sync, it's likely that Nvidia could support the latter with a simple driver update. If that's not the case, then they really botched the design of both Maxwell and Pascal, since they knew adaptive sync was coming; indeed, that's why they rushed out G-sync when they did. It's possible that they created G-sync precisely because they realized Maxwell's design couldn't support adaptive sync, but that wouldn't explain repeating the mistake with Pascal.
While AMD is currently the only GPU vendor to support adaptive sync, it's not going to stay that way forever. Intel has already said that they will support it, and I wouldn't be surprised if that support comes as soon as Kaby Lake. For that matter, some mobile GPU vendors could already support it without me being aware of it; the original justification was reducing power consumption by allowing a monitor to not refresh as often when it didn't have to, and that's a big deal in mobile devices.
I've long believed that Nvidia would support adaptive sync. For that matter, I'm surprised that they haven't yet; it would be suicidal not to. Holdenhamlet isn't the first person to buy a Radeon card instead of the GeForce card he would otherwise have bought because of Nvidia's refusal to support adaptive sync, and he won't be the last. At this point, I somewhat expect Nvidia to leave Maxwell GPU owners high and dry when they do get around to supporting adaptive sync, by not porting the support back as far as they could.
I'll leave it to the rabid Nvidia fanboys to explain how this proves that Nvidia's drivers are awesome and AMD's are terrible. That's actually not as sarcastic a statement as you might think; there are quite a few people in the world who view Nvidia's push of proprietary approaches and reluctance to support industry standards as a good thing, not a bad thing. As you can tell, I'm not one of them.