
graphics card question

huntersam Member Uncommon Posts: 210

I currently have a budget GeForce 430 (my expensive one blew a fuse). I am looking at an ATI 6950 as my replacement. What kind of benefits would I get from it, and is it a good card? I have been an Nvidia person for a long time; I just like the look of it.

Comments

  • Quizzical Member Legendary Posts: 25,483

    Yeah, a Radeon HD 6950 is a good card.  If you're not gaming at an unusually high monitor resolution, then 1 GB is plenty of video memory, so this will be a nice card for you:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814150523

    There is also a 2 GB version if you do use an unusually high resolution, which basically means 2560x1440 and up.

    If you'd rather not spend that much, then I'll mention this card as well:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814102932

    That will give maybe 75%-80% of the performance of a Radeon HD 6950, while being cheaper.
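
    To put rough numbers on that, here is a quick value comparison in Python. The prices are ballpark assumptions (not quotes from those listings), and the 0.775 figure is just the midpoint of the 75%-80% estimate:

        # Rough performance-per-dollar comparison between a Radeon HD 6950 1 GB
        # and the cheaper card linked above.  Prices are assumed, not quoted.
        hd6950_price = 240.0          # assumed street price in USD
        cheaper_card_price = 180.0    # assumed street price in USD
        relative_performance = 0.775  # midpoint of the "75%-80% of a 6950" estimate

        value_6950 = 1.0 / hd6950_price
        value_cheaper = relative_performance / cheaper_card_price

        print(f"6950: {value_6950:.5f} performance per dollar (normalized)")
        print(f"cheaper card: {value_cheaper:.5f} performance per dollar (normalized)")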

  • Durray Member Uncommon Posts: 182

    I'm still a sucker for Nvidia. I'm using a GTX 570 at the moment, and I have to say the new 500 series is great: good performance, quiet, and cool (temperature-wise!). Depending on your budget, get a 560 Ti, 570, or 580. Personally, I think Nvidia has the edge over AMD at the moment, though that is highly debatable, and the 6950 is still a good card.

  • Quizzical Member Legendary Posts: 25,483

    Overall, AMD's lineup beats Nvidia's right now.  AMD wins handily in performance per watt all up and down the lineup; the only exception is that a couple of AMD's salvage-part cards with partially disabled chips are merely comparable to Nvidia's best.  AMD also wins handily in performance per mm^2, which means performance per cost of building the chips.  That's not something you need to worry about directly, but cards that are more expensive to build tend to be more expensive to buy.

    There's not much point in getting a GeForce GTX 560 Ti right now.  Compared to that card, the Radeon HD 6950:

    1)  Typically performs better

    2)  Uses less power

    3)  Has a better feature set

    4)  Is cheaper

    These are mostly non-reference cards, so temperatures will depend on which particular card you get.  Between the two, AMD's card wins at everything and Nvidia's at nothing, which leaves little reason to pick a GeForce GTX 560 Ti.

    If you want to compare a GeForce GTX 570, the natural AMD competitor is the Radeon HD 6970.  There, AMD wins as above, except that the GeForce GTX 570 can make the argument that it's cheaper.  It performs quite a bit better than a Radeon HD 6950, too.  So between the GTX 570 and the Radeon HD 6970, either card can make sense depending on your budget.

    The GeForce GTX 580 is the fastest single-GPU card on the market, and if you need that for some reason and aren't sensitive to price, then it makes sense.  But the market of people who "aren't sensitive to price" isn't very big, and most people shopping in it can just as easily take multiple cards.  When a $500 card commonly loses to a pair of $140 cards in CrossFire:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814102932

    the $500 card isn't really that great of a value.  I'm often dubious of going with two lower end cards in CrossFire/SLI rather than one higher end card, as CrossFire and SLI impose other costs on the system.  But a $220 price gap on the cards themselves will more than cover the costs of having to get a power supply with more 6-pin PCI-E power connectors, a bigger case with more airflow, and a higher end motherboard with at worst x8/x8 PCI Express bandwidth.
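
    If you want to sanity-check that price gap, here is a rough sketch. The overhead figures for the power supply, case, and motherboard are illustrative assumptions, not prices I've looked up:

        # Rough total-cost comparison: one $500 card vs. two $140 cards in CrossFire,
        # including assumed premiums for the supporting hardware CrossFire needs.
        single_card = 500.0
        crossfire_cards = 2 * 140.0

        extra_psu = 30.0          # assumed premium for more 6-pin PCI-E connectors
        extra_case = 20.0         # assumed premium for a bigger case with more airflow
        extra_motherboard = 40.0  # assumed premium for at least x8/x8 PCI Express

        crossfire_total = crossfire_cards + extra_psu + extra_case + extra_motherboard
        print(f"Single high-end card:            ${single_card:.0f}")
        print(f"CrossFire pair with overhead:    ${crossfire_total:.0f}")
        print(f"Savings even after the overhead: ${single_card - crossfire_total:.0f}")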

  • Ridelynn Member Epic Posts: 7,383

    Right now, the only 500 series nVidia card that is a clear tie (and not even a winner) against the current ATI 6000 series is the GTX 570. It is on par in both price and performance with the ATI 6970. nVidia does have better 3D right now (AMD is relying on third-party technology that isn't widely available yet), and they do have PhysX (for both of the games that use it), but the 6970 has a couple of tricks up its sleeve that nVidia can't match: Eyefinity support without needing dual cards, but mainly, PowerTune.

    PowerTune is a very nice power management feature that is only found in the 6900 series (at least for right now). It basically ensures that your GPU always runs as fast as it can without burning itself up. Without a technology like PowerTune, the clocks have to be set low enough that stress programs like FurMark won't burn the card out, but that also means the card is underclocked for your games.

    Without PowerTune, sure, nVidia's cards look like great overclockers, but that's because they are underclocked on purpose to keep them from burning themselves out in high-stress situations. With PowerTune, the GPU automatically downclocks itself if it sees it's pulling too much power; otherwise, it runs as fast as it's able to, regardless of what program is running.
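
    As a very loose illustration of what that kind of power capping amounts to, here is a toy sketch in Python. This is my own illustration of the concept, not AMD's actual algorithm, and the power cap, clock range, and step size are made-up numbers:

        # Toy model of a PowerTune-style power cap: run at the maximum clock unless
        # the estimated board power exceeds the cap, then step the clock down.
        # All numbers here are illustrative assumptions, not AMD's real values.
        POWER_CAP_W = 250.0    # assumed board power limit
        MAX_CLOCK_MHZ = 880.0  # assumed maximum engine clock
        MIN_CLOCK_MHZ = 500.0
        STEP_MHZ = 10.0

        def next_clock(current_clock_mhz: float, estimated_power_w: float) -> float:
            """Step the clock down when over the cap, back toward the maximum otherwise."""
            if estimated_power_w > POWER_CAP_W:
                return max(MIN_CLOCK_MHZ, current_clock_mhz - STEP_MHZ)
            return min(MAX_CLOCK_MHZ, current_clock_mhz + STEP_MHZ)

        # A stress test like FurMark pushes estimated power over the cap, so the clock
        # ratchets down; a typical game stays under the cap and keeps the full clock.
        clock = MAX_CLOCK_MHZ
        for estimated_power in (300.0, 295.0, 280.0, 240.0, 230.0):
            clock = next_clock(clock, estimated_power)
            print(f"estimated power {estimated_power:5.0f} W -> clock {clock:.0f} MHz")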

    The real kick in the pants: the 2 GB 6950, which is around $80 cheaper than either the 6970 or the 570, and equal to or only a few dollars more than the 560, can usually be unlocked to perform identically to a 6970. There are risks and no guarantees with that, but it has a very high success rate, and that is a performance boost and "bang for the buck" that can't be matched by anything else right now.

    nVidia makes good cards, and nothing from ATI can match the 580 for speed on a single GPU. It's just that right now, and with the exception of the 570, you pay about 15-20% more for the brand name than you would for near-identical performance on the ATI side of the house. That, and nVidia doesn't have anything that even comes close to matching PowerTune, which is probably the most significant GPU technology to come out in the last 5 years.

  • madeux Member Posts: 1,786

    ATI: crisper, cleaner graphics and video, and less power.

    Nvidia: better overclocking and PhysX support.

    Honestly, PhysX support is the only reason I'd go with Nvidia right now. 

  • Cadwalder Member Posts: 20

    Like Ridelynn says, the 6950 is good value for money. Yes, it CAN be unlocked to perform similarly to the 6970. It's a good card: not very noisy, doesn't run too hot, and has good tessellation support. Somewhere down the road, you can add another one of these babies and still get above-average performance out of it too. And +1 for PowerTune. =D

    @madeux PhysX? There are hardly any games that use PhysX, and nobody really cares about it anymore. For me, that would never be a reason to buy an nVidia card. I was an nVidia fanboy until ATI rolled out the HD 5970. Despite the card being plagued with driver issues early on, it can still run any old or brand-new game on max settings at 1920x1080 without much trouble on my secondary system. I have to go easy on the AA sometimes, though.

    I bought an HD 6990 and a GTX 590 a couple of weeks back. The 6990 was giving me better performance (though it was quite loud at full load). Even though it was significantly quieter and somewhat lower in temperature, the 590 burnt out within days of use (at stock clocks!). For some reason, the temperatures are not even being recorded from the hottest parts of the card. nVidia rolled this one out in a rush and made a lot of mistakes. With this, nVidia has disappointed me for the third time in a row.

    I still look out for nVidia though. The majority (but not the best anymore) of cards I've owned and used are nVidia.

  • Quizzical Member Legendary Posts: 25,483

    Originally posted by Cadwalder

    I bought an HD 6990 and a GTX 590 a couple of weeks back. The 6990 was giving me better performance (though it was quite loud at full load). Even though it was significantly quieter and somewhat lower in temperature, the 590 burnt out within days of use (at stock clocks!). For some reason, the temperatures are not even being recorded from the hottest parts of the card. nVidia rolled this one out in a rush and made a lot of mistakes. With this, nVidia has disappointed me for the third time in a row.

    So now you appreciate:

    1)  Why PowerTune is a killer feature at the high end (though it doesn't matter much for $100 desktop cards), and

    2)  Why the 300 W cap of the PCI Express specification is still a good idea, at least for air-cooled cards.

    The temperature sensors are in the GPU chips, which are what put out most of the heat.  The cooler is designed very much to pull heat off of the GPU chips, but it doesn't do so well with the power circuitry.  The reference Radeon HD 5970 had a lesser version of the same problem, with insufficiently cooled VRMs.

    The problem with the GeForce GTX 590 isn't merely that the power circuitry isn't cooled well enough.  It's that there isn't enough of it for how much power the card can draw.  If a GeForce GTX 590 could never pull more than the 365 W that Nvidia claims is the TDP, then both the cooler and the power circuitry would be more or less adequate.  Not great, but adequate.  Try to do 450 W and things go boom.
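
    For what it's worth, the connector arithmetic backs that up. Under the PCI Express spec, the x16 slot is good for 75 W and an 8-pin plug for 150 W, so a card with two 8-pin connectors, as the reference GTX 590 has, is specified for at most 375 W. A quick sketch of that budget, using the 450 W figure above as the stress-test draw:

        # Power budget per the PCI Express spec vs. the rough worst-case draw cited above.
        SLOT_W = 75.0        # PCI-E x16 slot
        EIGHT_PIN_W = 150.0  # per 8-pin PCI-E power connector

        spec_limit = SLOT_W + 2 * EIGHT_PIN_W  # reference GTX 590: two 8-pin plugs
        claimed_tdp = 365.0                    # Nvidia's stated TDP
        stress_draw = 450.0                    # rough stress-test draw mentioned above

        print(f"Connector spec limit (slot + 2x 8-pin): {spec_limit:.0f} W")
        print(f"Claimed TDP: {claimed_tdp:.0f} W (within that limit)")
        print(f"Stress draw: {stress_draw:.0f} W, {stress_draw - spec_limit:.0f} W over the limit")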

  • Ridelynn Member Epic Posts: 7,383

    I was going to wonder how long it would be before we saw integrated water cooling units for GPUs come along (similar to the Corsair H50/60/70, or the Kuhler H2O), but then I remembered... these units are only designed for about 150-175 W max and are already pushing a 120 mm fan/radiator. To get past 300 W, you're looking at a significant increase in radiator surface area: either it would be so dense that it would interfere with most CPU heatsinks, or it would be so large it wouldn't fit in a 120 mm fan slot (like the dual/triple 120 mm radiators that are meant to mount in the top of ATX cases).
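
    Rough numbers on why a single 120 mm radiator doesn't get there, assuming heat dissipation scales roughly linearly with radiator area at a similar fan speed and thickness (the ~150-175 W rating is the one mentioned above; the 300 W-class load is an assumption):

        # Back-of-the-envelope radiator sizing, assuming dissipation scales roughly
        # linearly with radiator area at comparable fan speed and thickness.
        single_120mm_capacity_w = 165.0  # midpoint of the ~150-175 W rating above
        gpu_heat_load_w = 300.0          # assumed load for a 300 W-class card

        radiators_needed = gpu_heat_load_w / single_120mm_capacity_w
        print(f"Roughly {radiators_needed:.1f}x the area of a single 120 mm radiator,")
        print("i.e. a dual (240 mm) radiator or a much thicker and denser single one.")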

    It could be done with an external cooling unit (like a Koolance Exos); then it could be however big you needed it to be, but then you have something hanging outside the case.

    I think, unless we see a leap in cooling technology (which is possible; there was some talk about carbon nanotube cooling, and another about "self-cooling" semiconductors, both of which were fairly exciting), we may just be stuck around 300-400 W in that form factor. Card makers could start thinking about going triple-slot, but that gets structurally hard to do. Or we could change where the card mounts: maybe make GPUs more modular (like a power supply box), which would give them more room for cooling solutions and to spread out the power circuitry, and then run a PCI Express extender cable to the slot. It would be a totally different form factor from ATX, though.
