http://mmoculture.blogspot.com/2010/03/nvidia-gtx-480-1st-official-video.html
lol! They finally caught up to ATI's 58xx series.
I bet ATI already has a driver in their pocket ready for this eventuality; there is so much headroom left in the 5xxx series.
My only worry is ATI sleeping on the driver side of things, because on the hardware side there's no doubt the 5xxx just needs new drivers to pull ahead!
Nice to see nvidia finally releasing something soon, though! It will keep the battle lively (at least until Intel finally makes up its mind and picks a technology to compete against ATI and nvidia).
Comments
Saw this a few days ago; I don't fully trust the charts as they come from nvidia.
That said, I have some money ready for a 480 if it's any good, or failing that, ATI's best. It all comes down to whether I can find either in stock at launch.
The results shouldn't have been advertised unless it performed 50% above the HD5870. It uses 50% more transistors, is considerably hotter and consumes more energy. These results look to average around a 20% performance increase, which is not what you want when you use that many more transistors.
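For what it's worth, a quick sanity check of that performance-per-transistor argument, using only the rough figures quoted in this thread (about 50% more transistors and about 20% more performance; both are claims from the posts here, not measured numbers):

    # rough performance-per-transistor comparison, figures taken from the post above
    transistor_ratio = 1.5    # claimed: ~50% more transistors than the HD 5870
    performance_ratio = 1.2   # claimed: ~20% higher average performance
    efficiency = performance_ratio / transistor_ratio
    print(f"relative performance per transistor: {efficiency:.2f}")  # ~0.80

By those numbers the new chip would be getting roughly 20% less performance out of each transistor than the HD 5870 does, which is the point being made here.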
I was thinking the same thing, Cleffy. That's not enough oomph for the extra juice it sucks up. Maybe it's just too early in its development cycle to judge, though. Personally I'll stick with my 4870 until the next generation of the 5xxx series is released.
What??????
The juice? In that case you'd better have an underclocked and undervolted mobile processor, because you wouldn't want to use too much juice.
Sorry, but that post makes no logical sense. But whatever.
Now back to the 480: that test was bull. Why? Because it was not fps vs fps, it was tessellation vs tessellation.
Basically, stuff that will not matter for years to come, especially seeing the tens of reviews all landing at around a 50% fps rate. Cheers, sounds wonderful.
Anyway, we will just have to see how this works out.
One thing a sliding tessellation factor allows is for games to have high geometric detail on high-end systems and lower geometry on lower-end computers (very useful in multiplayer games).
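As a rough illustration of that idea (a hypothetical sketch; the function and numbers are made up for this example, not taken from any real engine), a game can derive its tessellation factor from a single detail setting plus the camera distance, so the same assets scale from low-end to high-end hardware:

    # hypothetical sketch: scale a tessellation factor from a detail setting (0..1)
    # and the camera distance, so geometry density adapts to the hardware tier
    def tessellation_factor(detail, distance, max_factor=64):
        base = 1 + detail * (max_factor - 1)      # low-end ~1, high-end up to the cap
        falloff = 1.0 / (1.0 + distance / 10.0)   # distant objects get fewer triangles
        return max(1, round(base * falloff))

    print(tessellation_factor(detail=1.0, distance=5.0))   # high-end setting, close up
    print(tessellation_factor(detail=0.2, distance=50.0))  # low-end setting, far away

The same mesh goes to the GPU either way; only the amount of subdivision changes, which is what makes it handy for multiplayer games where everyone has to see the same world.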
ATM it looks like the 480 is just a super-crammed 280.
"Society in every state is a blessing, but government even in its best state is but a necessary evil; in its worst state an intolerable one ..." - Thomas Paine
Sad. I was an nvidia guy forever, then the 5850 came out and I couldn't say no. Six months later here we are, and nvidia is still not able to compete. Craziness.
Always used nvidia; currently using the GTX 260, which is a decent enough card. The only time I ever tried to use an ATI card I couldn't get it to work. That was a while ago, so perhaps ATI doesn't have the driver problems it used to, but it pretty much decided me against ever trying them again. Once DirectX 9c is finally replaced with DirectX 11 I'll probably upgrade to a newer card again, as it's looking increasingly like DirectX 10 is going to be bypassed entirely.
Nah, ATI still lacks in one department, and I own an ATI card:
it's the scan feature. Sorry, but nvidia runs smoother for the average consumer for one reason:
you can ask nvidia to scan your system (if you bought an nvidia-chipset card, you're covered).
It detects your card, then shows you the latest drivers available for it.
ATI lacks that! ATI needs a scan tool, or at least something more descriptive; the one they have right now sucks.
And ATI needs to do a lot more polishing on their drivers.
Aside from that I'm perfectly happy. True, ATI hasn't felt any pressure for the last 9 months, and with 9-month-old gear they just need to polish their drivers and they'll easily beat nvidia.
And it made me laugh when nvidia used tessellation to compare their card. Why didn't they use the same programs everybody else uses? I understand their view, and I agree the ATI 5xxx series and the nvidia 4xx series can't be compared on sheer power alone, since they're made for parallel computing, so the tests most reviewers have been using for the last 5 years
don't show the true picture. At the very least nvidia should have sent the card to an independent reviewer.
And what is going on with Intel? They're getting a 6-core processor out, which means they're probably working on a CPU-sized graphics chip that would sit beside the CPU, probably with close to the same number of pins.
Man, no news from Intel is rarely a good thing for the other companies.
Since we all know they're already on the 32nm process for processors and 25nm for memory,
that means before the year's end we'll probably get a gamer edition of an Intel graphics solution (a market Intel has never even entered before), so the next 9 months will be interesting, lol.
I disagree on drivers. For the last year at least, ATI has had the better drivers. In my opinion their drivers have been better since the AMD merger and only continue to get stronger. I don't think Intel will ever truly be a challenge in GPUs for the discrete manufacturers. They just don't view the GPU as a GPU, and that's why their GPU designs are always slower than nVidia's or ATI's. Every year someone says Intel is going to enter the high-end GPU market, yet they still haven't surpassed GPUs made 7 years ago. I don't think we will have a 3rd player in discrete GPUs until Matrox gets a partner, as they are the only ones close to ATI and nVidia.
I don't think we will see a good card out of nVidia until Q4 2010. I think the GTX 480 is going to be released in a similar fashion to the HD2900 a few years ago. Also, we should be seeing independent benchmarks from review sites in the last week of March. BTW, GPUs are now marching towards 28nm. Shouldn't forget GPUs are at half-node.
Mm, aren't they at 40nm right now? My ATI 5770 is 40nm,
so from 40 to 28 is a very big jump, even if ATI is helped by AMD on the process front!
The issue with Intel is that they don't go at it with a card like ATI and nvidia have been doing;
they go at it in a CPU-chip form factor, so their challenge is way bigger than ATI's.
On the integrated market they've been leading for a while now, and don't forget the i3 and i5 are their in-between tech, since they get smaller every year but change the way they make processors every 2 years. I was surprised they didn't put a 32nm graphics chip on the i3 and i5.
But in the end I think Intel will go this route:
1: They'll put 8 GB of DRAM on the processor instead! Their new 25nm memory process is small enough to put all the RAM on the processor.
It will probably be different memory too, DRAM or faster.
2: Intel is working hard on a CPU-sized graphics solution that is as powerful as, or equivalent to, the most powerful cards.
But Intel isn't in a hurry to jump in, and hell, I don't blame them; they make some of the fastest processors in the world
and they have the biggest chunk of the market. They have Direct Cache Access, etc. True, IBM helps AMD, but it's very limited.
Intel makes the motherboard, the CPU, the network card, the RAM, the SSD, etc., etc.,
so you end up with a better and more stable package from Intel in the long run.
When talking about size, they work in nodes. A node is roughly the feature size at which you can fit twice as many transistors into the same area as the previous node. CPU manufacturers typically work at full nodes (90nm, 65nm, 45nm, 32nm, 22nm...). Chips that don't need the same sophistication as a CPU typically work at a half node (80nm, 55nm, 40nm, 28nm...).
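To see where those numbers come from: doubling the transistor count in the same area means shrinking the linear feature size by a factor of about 1/sqrt(2), roughly 0.71, per full node. A minimal sketch of that progression (an idealized calculation, not an actual foundry roadmap):

    import math

    # each full node shrinks the linear feature size by ~1/sqrt(2),
    # which is what lets twice as many transistors fit in the same area
    node = 90.0
    for _ in range(4):
        node /= math.sqrt(2)
        print(round(node))  # 64, 45, 32, 22 -- close to the real 65/45/32/22nm full nodes

The half nodes (80, 55, 40, 28nm) fall roughly in between those steps, which is why GPUs sitting at 40nm would be heading to 28nm rather than 32nm.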
From my personal view, ATI's drivers have been better than nvidia's for quite a while already.
I dunno guys, I'm not taking sides or anything, but as a really limited-budget gamer I stick by my GTS 250 E-Green and its 3D Vision. Also, in my personal opinion, 3D Vision can be quite sweet in some high-end games like Avatar, Batman, etc., but I would like to see more MMOs support it, like WoW.
On top of that, I would like to add that the GF100 GPUs are going to be one up, since they're going to have outstanding frame rates, plus you get technologies like PhysX, CUDA and of course 3D Vision on top. So in my opinion the powerful ATI chew toys don't stand a living chance no matter the model, since they don't use their power and are basically a waste... it's the quantity-versus-quality thing all over again.
Limited budget and 3D Vision don't really add up.
Why would a closed standard be better than an open one? ATI supports, among other things, Bullet, Pixelux and Havok via OpenCL. PhysX is still next to useless, and with no 3rd-party support it will die soon enough.
I found this article pretty interesting:
http://physxinfo.com/articles/?page_id=154
-----------------------------------------------------------------------------------------
4. Conclusion
Today the situation on the physics engine market is clear enough, but we'll return to this topic at the end of 2010 with great pleasure, because 2010 is going to be rich in interesting events. Havok will try to get hold of indie developers again, this time with an Independent Developer Program. In addition, Havok is heading the omni-solution way, as it now has brand-new tools (AI, Cloth and Destruction) in its product line. What's next: Havok Sound? Havok Network?
PhysX SDK, in turn, will endeavour to conquer the console market with the long-awaited SDK 3.0 (reworked architecture, new features, extensive optimizations for Xbox 360 and PS3) and the APEX toolset.
Another topic is GPU-accelerated physics: what's going to prevail, industry "standards" like OpenCL Bullet (without GPU-physics-based tools and experienced developers, but working on a wide range of hardware) or proprietary developments like PhysX (with direct support from Nvidia, a GPU-physics-oriented architecture, a complete toolset and a team of content developers)?
"If you want a picture of the future, imagine a robot foot stomping on a human face -- forever."
I have an ATI 5870 and the drivers were so terrible I pulled the card out of my rig and put my GTX 280 back. Granted I bought the card literally on the day of its launch (didn't have a choice, my GTX 280 died, the RMA process was 2 weeks long and I needed a vid card NOW).
I'll probably put my ATI card back in when I head back to AoC when the expansion launches.
Otherwise my GTX 280 still runs just about every game at max settings at 1920x1200, so there hasn't been a need for me to use the buggier ATI card.
Alltern8 Blog | Star Wars Space Combat and The Old Republic | Cryptic Studios - A Pre Post-Mortem | Klingon Preview, STO's Monster Play
You're a complete dolt if you buy the GTX 480. Why, you ask? I will tell you why: have we all already forgotten the 8800 GTX, the standard in gaming for the longest time? Yes, I admit I did upgrade my 8800 GTS (yes, GTS) to a GTX 260, only because I felt I needed DX10. And it has been pointed out to me that even though Win 7 says I'm running DX11, that is not true; I verified this on BFG's site. Sorry, no links needed, just take my word: the GTX 260 is only a DX10 card, or 10.1 at best, as if the .1 really matters. Anyway, back to the subject at hand: the 8800 GTX will still run most MMOs on high settings. Let's face it, MMOs don't run at the highest of graphics anyway, with a few exceptions like Vanguard. Now if you are into FPS games, then by all means you need a good DX10 card. But even Assassin's Creed II's recommended system requirements are:
* Processor: Intel Core 2 Duo E6700 2.6 GHz or AMD Athlon 64 X2 6000+ or better
* Video Card: GeForce 8800 GT or ATI Radeon HD 4700 or better...
"Or better": in other words, if you have a better card, good for you, now you can set this game anywhere from high to ultra-high. lol, cool.
This is one of the highest-end games out there to date. So basically, what is the point of all this? Simple: don't waste your money on a card that 99.9% of game companies don't even support yet.
Even the address of that site (physxinfo.com) tells you how neutral the opinion above is.
Yeah, it's the main issue with ATI! They don't have a scan feature, which makes updating a lot of guesswork.
With nvidia you just hit their scan button on the site and let it do its thing.
With ATI you've got to select your OS, then select the software you need, but if you're not computer literate you end up choosing the wrong one (like I did), then you have to restart the whole process, etc., etc.
The driver-update feature on ATI's site is a nightmare!
I have been using nvidia's beta scan, and I can tell you that feature alone undoes all the bad that can happen.
ATI will have to work a bit less on their hardware and a whole lot more on the driver and software front.
Maybe if ATI (AMD) calls Intel for a bit of help, Intel will show ATI how!
You don't have to be a PC expert to know what graphics card you have; besides, you can check your PC manual. And once you find out which card you have, you can remember it for the next time, or write it down somewhere. It's not like you have to do that every time you want to update your drivers.
As for what operating system you've got, it's more than obvious every time you switch your PC on.
Next time, if you get a new card, you will remember the model, so no problem.
I'd prefer ATI to put more money into research for new technologies and driver optimization rather than making scan tools and how-tos for the noobs.
That scan complaint is as full of crap as can possibly be. Don't blame others if you're the one who clicked the wrong options.
I don't know, the article seemed extremely non-biased to me. Did you find it otherwise? I actually found it leaning more toward Havok than PhysX. It didn't really discuss opinions at all, just facts about how both APIs progressed, mostly in 2009.
Is it the fact that games like Mass Effect, Gears of War and Dragon Age were mentioned as using PhysX and nothing much was mentioned as using Havok? Or that they're targeting consoles with PhysX SDK 3.0 working on the 360 and the PS3?
Sorry I brought it up... you're right... PhysX is a useless API and is short-lived... /sarcasm
*edit* Not to say Havok doesn't have some awesome titles; in fact the article pretty much made it clear that Havok has more of the highest-rated games than PhysX. I guess my point was that PhysX isn't useless, and it's gaining a lot of momentum, especially over the past year.
"If you want a picture of the future, imagine a robot foot stomping on a human face -- forever."
What I meant is that PhysX usage in games has only been single flags and stuff, i.e. pretty meaningless when you think about the game as a whole. NV has actually been paying game companies to put PhysX in their games, via TWIMTBP.
One good example of this is Batman: Arkham Asylum, which is a TWIMTBP game; even AA doesn't work in it with ATI cards.
PhysX could have possibilities, but as long as nv decides to keep it closed it will just become the next Glide and die away soon enough.