Just finished reading this interview with Richard Huddy, AMD's developer relations guy, and thought a few tech heads might be interested. It's a really good (but long) interview and covers quite a few things, including Arkham, Larrabee, the problems ATI had with Saboteur and (kinda) their plans for physics.
http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1
I haven't bought anything from AMD or ATI in a few years now but after reading this that might change:
"That's the difference in the way [AMD] works - we work through enablement and open standards. [Nvidia] works through closed standards and disablement, which, to me is inexcusable; it's as bad as that." - Richard Huddy, AMD Developer Relations
Played in some form:
UO til tram, AC, EQ, AO, WW2O, PS, SB, CoH, AC2, Hor, LoTRO, DDO, AoC, Aion, CO, STO
Playing: WoW (for gf), WAR
Waiting For: SWTOR, FFXIV
Hoping For: DCUO, Secret World, Earthrise
-S- (UO Sonoma)
Comments
I have certainly bought Nvidia cards (most recently a G80 Geforce 8800 a few years back), but what AMD's guy is saying there is actually true. Nvidia has used closed standards and practices designed to restrict technology time and time again, hurting the industry in the process.
Recently, their approach to DX10/10.1 comes to mind. DX10 was a failure in many ways: although it enabled new visual features, it was slower than DirectX9, so it forced tradeoffs between performance and visual quality that were generally unappealing. DirectX10.1/SM4.1 remedied this greatly by improving anti-aliasing performance enormously. The problem was that Nvidia's cards didn't support DX10.1. In fact, it took them over two years to catch up to AMD-Ati in that respect. Rather than working harder to get a DX10.1 part on the market, Nvidia launched an assault on DX10.1, starting with the famous Assassin's Creed debacle, where a patch (in a game made by a developer who was in a co-advertising arrangement with Nvidia) removed DX10.1 support. Thanks in part to Nvidia, DX10-generation engines essentially never got off the ground until very recently, and at this point DX11 is already out. Now Nvidia is trying to do the same thing to DX11, trying to convince people to ignore DX11 in an attempt to undermine it rather than working harder to get a DX11 part on the market (namely, the almost fabled Geforce G300 series).
Nvidia absolutely works to undermine new technologies as their primary means to compete in the market; they close off new technologies and make the market narrower and stagnant rather than broadening and advancing the market by pushing new technologies.
They are also the master of closed standards. Whereas AMD-Ati has pushed open standards like OpenCL and DirectCompute (now part of DX11), Nvidia has done two things with GPU computing: attempted to corner the market with Physx, and attempted to ensure that no one but them gets Physx. They've done the same thing with CUDA. I'm sure that AMD would attempt to corner the market if one of their technologies suddenly became dominant, but the point is that they don't actively pursue that as their primary means of competition. They instead rely on innovation. AMD has always been one of the greatest innovators out there. In fact, the two biggest revolutions in CPU design were both AMD-created: the x64 instruction set and integrated memory controllers. On the GPU market they've always been ahead of the game too, and not just with APIs, which they always support first, but even with hardware design, as seen in the Radeon HD 4000 series, which beat out Nvidia's monolithic-GPU-of-doom design in the G200 with smaller but more scalable GPUs.
I'm certainly no die-hard AMD/Ati fan who can't buy anything else (my last rig was a Core 2 Duo E8400 based system with a G80 Geforce 8800GTS), but I certainly do like AMD as a company; there's no denying that.
NVidia's practices have come back to haunt them. Their cards were the best, in my opinion, because they had good drivers and worked with everything. However, that came at the cost of innovation, unless the innovation was owned by nVidia! They steamrolled manufacturers and game studios into doing what they wanted, hurting their competition in the process - bribing or blackmailing studios into designing games to run shitty on ATI GPUs and be as unstable as possible on ATI drivers.
Well, now that iron grip on the industry has weakened, a lot. As Princess Leia would say, "The tighter you squeeze, the more of the market share will slip through your fingers!" Even today nVidia is concentrating on redundant tech and gimmicks to sell their cards whilst bashing DX11. Well, it won't work now: Win7 ships with DX11, every gamer that has a clue wants DX11 yesterday, and fuck nVidia if they don't like it.
NVidia are going to have to get their shit together and make some decent cards instead of rebranding the 8800 ad nauseam and relying on advertising gimmicks. The 8800 was, in my opinion, a turning point in the history of GPU making; it was (and still is, to some extent) an awesome GPU. No other GPU has had such staying power in performance or held its price so well over so many years. Until nVidia go back to those days, ATI will be riding them like the bitches they are. I'd rather have my ATI 5770, with DX11, at a lower price than a GTX 260 and with better performance.
I think nVidia are going to learn a very painful lesson this year, and they're going to be forced to weather the storm. Rebranded 8800s just aren't going to compete in today's market, and they won't be able to hold back those gamers who're screaming for DX11.
"Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience"
CS Lewis
http://www.semiaccurate.com/2009/09/15/nvidia-gt300-yeilds-under-2/
http://www.semiaccurate.com/2009/10/06/nvidia-kills-gtx285-gtx275-gtx260-abandons-mid-and-high-end-market/
I didn't realize the situation was that dire for Nvidia, though I had guessed it might be, due to the never-ending pushbacks of the G300 series, which was supposed to be out last year and about which there hasn't been so much as one solid peep.
If it really is as bad as this site is indicating, and Nvidia is really withdrawing from the market so much, this could be a prelude to financial collapse for the company.
This article:
http://www.brightsideofnews.com/news/2009/11/4/batmangate-amd-vs-nvidia-vs-eidos-fight-analyzed.aspx
Goes into pretty good detail about the Batman anti-aliasing/Physx mess and is an example of some of the things Richard Huddy is touching on. Pretty good read.
Basically, when RockSteady was making Batman, AMD assumed they were going to use standard AA code. Instead, nVidia wrote the AA code for them just before release and added a no-AMD flag.
AMD saw it was standard AA code and told RockSteady they could just remove the flag for it to work, but Eidos's legal dept told them they couldn't touch the code since nVidia had written it. Eidos invited AMD to write their own AA code, but it seems AMD didn't, because the existing AA code was industry-standard DirectX anti-aliasing and theirs would've been no different.
Of course, AMD should've taken a day out to have someone copy-paste their AA code into the engine, but nVidia shouldn't be paying devs to add standard code behind a proprietary 'nvidia only' flag either.
I've seen some people compare this to AMD's involvement with Dirt 2 adding DX11, but the difference is that DX11 is completely open and anyone can support it; there's no -no_nvidia switch, they just need to get their hardware out the door.
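For the curious, vendor locks like the one described above typically key off the GPU's PCI vendor ID (0x1002 for ATI/AMD, 0x10DE for nVidia - those values are real and well known). Here's a minimal sketch of that kind of gate, in Python purely for illustration; the function and constant names are made up, not taken from the actual Batman code:

```python
# Real, well-known PCI vendor IDs for the two GPU makers.
VENDOR_AMD_ATI = 0x1002
VENDOR_NVIDIA = 0x10DE

def msaa_enabled(vendor_id: int) -> bool:
    """Hypothetical version of the reported check: the standard
    DirectX anti-aliasing path is switched on only when the
    detected GPU reports nVidia's vendor ID."""
    return vendor_id == VENDOR_NVIDIA

# On an nVidia card the AA option appears...
print(msaa_enabled(VENDOR_NVIDIA))   # True
# ...while the exact same code path is refused on ATI hardware.
print(msaa_enabled(VENDOR_AMD_ATI))  # False
```

The point of the sketch is just how cheap such a lock is: one equality test against a vendor ID, sitting in front of otherwise vendor-neutral DirectX code.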
All nice and whatever, but until I can slap an Ati card in my comp, install one easy dedicated driver, and be done with it, I'll stick to Nvidia.
I always used Nvidia up until a month ago, when my card burned up (it was just its time after 2+ years). So I thought I'd try an Ati card and bought an HD3850. Worst graphics card experience ever. I had to go through multiple driver versions plus hotfixes just to find one that ran it stable. So I said screw it and went back to Nvidia. Totally hassle free. The Ati card is now in some sucker on Craigslist's hands.
The 3850 might be an archaic card (it's now two generations old), but even so, if the card functions then it's simply a matter of downloading and installing the latest Catalyst driver package. Just pop in the card, install the driver, and you're done. My 4870 was even easier than that with Windows 7. My Windows XP partition required me to install Catalyst myself (a process that took all of 5 minutes), but on 7 the OS detected and installed the card automatically, and then Windows update installed the latest drivers for it.
I have no idea what all these supposed stability problems of yours are, but that's hardly a typical experience. In fact, my Radeon HD 4870 actually corrected a few problems my Geforce 8800 suffered from (most notably, Nvidia 17x.xx drivers didn't support anti-aliasing in Battlefield 2142).
When my dual 5770s arrive tomorrow for Crossfire, it'll be the same exact process, minus the driver installation, as my present driver set should recognize the new cards without trouble.
Yea idk, I tossed a 3870 into a friend's PC I'm upgrading just to test out the PSU before I bought a used 8800 GTS for him and the drivers worked easy.
Well, see, I'm talking about the AGP HD3850, not the PCI-E version. The AGP model is newer, even though its model number makes it look old. But the problems weren't just with the drivers; ATI support is a joke. The cards are made by outside vendors (Sapphire, in my case). They have ATI slapped all over them, use ATI drivers, and ATI gets money from the cards. But ask a question or try to get help from ATI, and you get "Sorry, we don't support AGP versions of our cards". Yet ATI makes the damn drivers and hotfixes for them!!!! And yet they'll take credit for "supporting" AGP when Nvidia stopped.
As far as "hardly a typical experience" goes, where have you been, under a rock? ATI is best known for its shitty drivers, more so than for the cards themselves.
Hardware experiences vary. Personally, I have never had a problem with ATI but have had numerous problems with nVidia. The HD3850 AGP is a bad investment, as the AGP slot cannot even fully utilize the card. I would say that since the AMD acquisition of ATI, nVidia has had more newsworthy problems than ATI and has updated drivers less frequently. Whenever we encounter a hardware issue we just plain don't shop with that brand anymore, which is fine.
The thing I picked up from this interview is that ATI supports open architectures, so it doesn't have to invest R&D into supporting its own formats. This frees it up to invest in better technology. It's had the technological advantage since the Radeon 9xxx, which is something.
This is truly some great stuff! You're supposedly comparing Ati to Nvidia, complaining that Ati doesn't support their AGP boards well enough, and yet you freely admit that you couldn't even get a comparable AGP board from Nvidia (Newegg's fastest Nvidia AGP board is only a Geforce 7200GS with GDDR2 memory). So, given your AGP options, what you're basically doing is criticizing Ati for not having the support Nvidia does, when Nvidia doesn't have support of any kind (not being a company that actually manufactures the boards), solely on the assumption that if Nvidia still made AGP boards anywhere near as high end as the Radeon HD 3850, buying one would cause rainbows to form above you, Skittles to rain from the sky, and the heavens to open up with jubilation over your choice to purchase from the almighty and divine Nvidia. In essence, you're comparing Ati to an idealized alternative that doesn't actually exist as a basis for criticism; this is called a Nirvana Fallacy. Really, you can go look it up.
As for the supposedly "typical" nature of your encounter: I've been living in Charlotte, NC, if you must know, in a house with its share of computers, many of which have seen Ati GPUs at some point or another, and none of which have suffered from the problems that supposedly the whole world associates with Ati. In reality, it's Nvidia fanboys bending over backwards to typify these sorts of problems that create such a perception, if one really exists, because for these past few years Nvidia has had no fewer problems than Ati, something I say confidently as a frequent purchaser and owner of GPUs from both companies. Anecdotal evidence does not equate to a general trend.
I would actually characterize my personal experiences with both companies as very good, benefiting from frequent driver updates and a good overall progression of GPU technology over these past years. I was very happy with my Geforce 8800GTS, and the person who inherited it for their machine is still very happy with it, and while it had far more driver problems than my Radeon HD 4870, those problems were still few and far between, and I was easily able to get support from either company (or from my particular card's distributor).
Catamount, you just fed a troll, or at least a serious fanboy. Anyway, the fact is that ATI's drivers are as good as Nvidia's these days.