GTX 1080 is here

1246 Comments

  • Doug_BDoug_B Member UncommonPosts: 153
    edited May 2016
    Hrimnir said:
    Anandtech has been a very pro-Nvidia/Intel website for a long time. If you want a website, try guru3d.com; though he's also pro-Nvidia, his tests are decent.
    Bachelor's in Web Design and Multimedia
  • HrimnirHrimnir Member RarePosts: 2,415
    Doug_B said:
    Hrimnir said:
    Anandtech has been a very pro-Nvidia/Intel website for a long time. If you want a website, try guru3d.com; though he's also pro-Nvidia, his tests are decent.
    The fact that you actually believe that shows how clueless you are.

    "The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently."

    - Friedrich Nietzsche

  • MalaboogaMalabooga Member UncommonPosts: 2,977
    edited May 2016
    Some leaked GTX 1080 DX12 tests (Fury X test on the same game version in the post below):



    Post edited by Malabooga on
  • zaberfangxzaberfangx Member UncommonPosts: 1,796
    Malabooga said:
    Some leaked GTX 1080 DX12 tests (Fury X test on the same game version in the post below):



    Only problem I have with this test: the CPU is a bit off.
  • KyutaSyukoKyutaSyuko Member UncommonPosts: 288
    Malabooga said:
    Some leaked GTX 1080 DX12 tests (Fury X test on the same game version in the post below):



    Not only is the 1080's CPU clocked at 3.5 GHz vs the 3.0 GHz of the R9 Fury, the settings for the 1080 are High vs Ultra, and the Fury X still has a better average frame rate...
  • xyzercrimexyzercrime Member RarePosts: 878
    Malabooga said:
    Some leaked GTX 1080 DX12 tests (Fury X test on the same game version in the post below):



    Only problem I have with this test: the CPU is a bit off.
    Yeah, somehow the CPU clock is lower on the AMD side.



    When you don't want the truth, you will make up your own truth.
  • gervaise1gervaise1 Member EpicPosts: 6,919
    edited May 2016
    My problem is how the linked site has a "one test" score at all when long-established sites haven't yet got cards to do any tests at all.

    Oh, and how news outlets have missed these test results....
  • PhryPhry Member LegendaryPosts: 11,004
    Malabooga said:
    Some leaked GTX 1080 DX12 tests (Fury X test on the same game version in the post below):



    Only problem I have with this test: the CPU is a bit off.
    Yeah, somehow the CPU clock is lower on the AMD side.
    Those are two very different processors, tbh, and honestly I'm not sure the clock rate is as important as the other factors. The two examples given could be interpreted as somewhat biased, something which would not be a factor if the two systems were identical in all regards except the GPUs in question; if they are not, it renders the comparisons meaningless. :o
  • 13lake13lake Member UncommonPosts: 719
    edited May 2016
    The kicker seems to be that the Founders Edition is not higher binned nor higher clocked; it's just the reference version. And it's not gonna be just $100 more expensive from the Nvidia website; it's gonna be $700 whether it's the Zotac reference or Inno3D or EVGA, etc...

    Custom-cooler ones from the AIBs are gonna be conveniently late and/or perpetually out of stock (surely not because of a lack of GDDR5X chips :P)

    A multi-staggered and delayed paper release, with the pre-orders $100 more expensive and limited to the reference cooler. If anyone was wondering how Nvidia was gonna actually launch this before fall, this is the answer :)

    And if people thought the late custom-cooler 290/X cards were taking too long back in the day, well, they're in for a record breaker now :)
  • MalaboogaMalabooga Member UncommonPosts: 2,977
  • SEANMCADSEANMCAD Member EpicPosts: 16,775
    Malabooga said:
    it really doesn't take a rocket scientist to figure it out.

    just look up past gen releases on wikipedia, extrapolate...done
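For what it's worth, that "look up past releases and extrapolate" approach fits in a few lines of Python. This is only a sketch: the launch dates below are approximate and worth double-checking against Wikipedia, and averaging past gaps is my own simplification of "extrapolate".

```python
from datetime import date

# Approximate launch dates of past NVIDIA flagship cards (illustrative;
# verify against Wikipedia before relying on them).
launches = {
    "GTX 480": date(2010, 3, 26),
    "GTX 580": date(2010, 11, 9),
    "GTX 680": date(2012, 3, 22),
    "GTX 780": date(2013, 5, 23),
    "GTX 980": date(2014, 9, 18),
}

dates = sorted(launches.values())
# Days between consecutive flagship launches
gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
avg_gap = sum(gaps) / len(gaps)
# Naive projection: last launch plus the average gap
projected = date.fromordinal(dates[-1].toordinal() + round(avg_gap))
print(f"average gap: {avg_gap:.0f} days, projected next launch: {projected}")
```

Amusingly, this naive average predicts a late-2015 launch for the next flagship, which shows how rough simple extrapolation is when cadence depends on process-node availability.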

    Please do not respond to me, even if I ask you a question, its rhetorical.

    Please do not respond to me

  • Doug_BDoug_B Member UncommonPosts: 153
    LOL



  • RidelynnRidelynn Member EpicPosts: 7,383
    Torval said:
    13lake said:
    The kicker seems to be that the Founders Edition is not higher binned nor higher clocked; it's just the reference version. And it's not gonna be just $100 more expensive from the Nvidia website; it's gonna be $700 whether it's the Zotac reference or Inno3D or EVGA, etc...

    Custom-cooler ones from the AIBs are gonna be conveniently late and/or perpetually out of stock (surely not because of a lack of GDDR5X chips :P)

    A multi-staggered and delayed paper release, with the pre-orders $100 more expensive and limited to the reference cooler. If anyone was wondering how Nvidia was gonna actually launch this before fall, this is the answer :)

    And if people thought the late custom-cooler 290/X cards were taking too long back in the day, well, they're in for a record breaker now :)
    I'm curious what the exact differences in the editions will be as well. What will that $440/$700 price tag buy?
    Apparently, it buys you a spot in line to get the card. Otherwise, you have to wait for general availability for the aftermarket editions. That's my guess - I may be wrong, we'll see come May 27.

    I kinda agree with 13lake; this seems to be how they will control an otherwise limited release. Those who are willing to pay for it go to the front of the line.

    That being said, I read on another site (or maybe it was linked here; I apologize, I get confused so easily) that Founders was heavily lobbied for by boutique manufacturers and integrators (Falcon Northwest, Alienware, etc.): the SKU would stay in production for the life of the chip and would have a decently capable cooler with external exhaust, which would let them validate the card up front and keep using that same SKU in all their builds without revalidation if the cooler changed, the clock speed were adjusted, or whatever.

    I doubt the 3rd-party integrators lobbied for the $100 price increase as well, but they probably get some "corporate wholesale discount" where that gets negated for them anyway.
  • HrimnirHrimnir Member RarePosts: 2,415

    I don't know why everyone is so up in arms over this.  Manufacturers in all aspects of life have been doing this for ages; they're just now being transparent about it. Previously, instead of Nvidia doing it, the retailers did it: Newegg was selling 980 Tis for ridiculously more than MSRP, same with i7-6700Ks, etc.  If you want to be the first kid on the block with the new toy (whether that's a new Porsche 911, Audi R8, Intel processor, Omega watch, etc.), you have to be prepared to pay through the nose for it.

    The simple fact is there IS a market for "overpriced" products being sold early.  I put overpriced in quotes because it's a misleading term; something is worth what someone will pay for it.  For example, most of us would scoff at paying a few dollars for a gallon of water, but in a drought where the grocery store shelves have been sold out of water for a week, you'll readily pay $40 for a gallon.


  • Doug_BDoug_B Member UncommonPosts: 153
    Hrimnir said:

    For example, most of us would scoff at paying a few dollars for a gallon of water, but in a drought where the grocery store shelves have been sold out of water for a week, you'll readily pay $40 for a gallon.

    If water were at that level of scarcity, I think most people would start digging until they found water.
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    Hrimnir said:

    I'm ordering some crow to cook up for people to eat here in a month or two.

    I mean, seriously, the amount of fanboying on both sides of this issue in this thread is facepalm-worthy.

    Quiz is still trying to claim you won't see a 1080 even in July, and that Polaris will be available in July, even though we've seen barely any info.

    So basically the TL;DR is "Nvidia are liars and they're going to epically screw up, while AMD are perfection and will magically release a card that we have almost no info on BEFORE Nvidia releases their new cards."

    I think I'm going to trust a reputable publication like Anandtech over a bunch of fanboys:

    http://www.anandtech.com/show/10304/nvidia-announces-the-geforce-gtx-1080-1070

    The first two cards out of the gate will be NVIDIA’s high-end cards, the GTX 1080 and GTX 1070. 1080p jokes aside, these are straightforward successors to what has now been NVIDIA’s product stack naming scheme since 2010, with the GTX 1080 representing their new high-end card while the GTX 1070 is a cheaper card meant to hit the enthusiast sweet spot. These cards will be launching over the next month, with GTX 1080 hitting the market May 27th, and GTX 1070 two weeks later, on June 10th. They will be priced at $599 and $379 respectively.

    I could believe that a few GTX 1080s will trickle out slowly over the course of the next few months.  But if you want a more quantified claim, I'm quite certain that there won't be a week in July where Newegg has a GTX 1080 in stock and under $650 for the entire week.  Replace $650 with $400 and there's a decent chance the same will happen to the GTX 1070--as well as a decent chance that it will have widespread availability at or below MSRP by the end of July.

    I'm not claiming that Polaris will definitely be available in July.  I'd be very surprised if there aren't any reviews out on any Polaris chips by the end of July, and mildly surprised if there aren't any Polaris cards at all readily available for anyone who wants to buy one.  I won't be surprised at all if at the end of July, desktop Polaris is available, but prices are high enough to encourage people to buy a previous generation card instead.  That's happened to quite a few non-top end chips on new process nodes in the past, most notably Pitcairn and Juniper.

    The other big unknown on Polaris is clock speeds.  Nvidia is claiming that Pascal can get massively higher clock speeds than previous generation GPUs.  Will Polaris do the same, simply because that's how the newer process nodes work?  Likely, but we don't know--and AMD is using a different process node from Nvidia, so there could be huge differences just from the process node.

    Which card is widely available first isn't about which vendor decides to announce information first.  Rather, it's almost entirely about which chip had a big production order of wafers first, especially if that's a gap of months, unless a fab screws up and can't deliver the order that they were supposed to.  And to be fair, Global Foundries is more likely to make a colossal screw-up like that than TSMC.  But even after the fact, vendors don't commonly announce when they started mass production of particular discrete GPU chips.
  • HrimnirHrimnir Member RarePosts: 2,415
    edited May 2016

    @Quizzical

    That's precisely my point.  We're making wild guesses about how much availability they will have.  The difference here is that it's not like the 16/14nm processes are literally brand new. They're new for GPUs, but not for CPUs, and some of that knowledge and resolved teething issues does transfer over; not all of it, but some of it.

    I don't think anyone here was ever going to claim that the masses could go buy GTX 1080s on the day of release without waiting or paying a premium. In my memory of buying hardware I can't think of a single time when a low-yield high-end part like that was readily available in the first month or more after release. Again, if you take the recent example of the 6700K, which was in full-blown production for months and still selling over MSRP because demand was THAT high, you can see it's not ALL about production.

    Yes, for the first 4-6 weeks, maybe even 4-8 weeks, it's mostly about yields and such. However, my main point is that this is no different from any other high-end product in the history of computers, so frankly it's pointless to argue it as some kind of negative for Nvidia, because they and AMD have been doing exactly this since time started.

    I want to apologize; I've been railing on you pretty hard. I got a little caught up in some of the other fanboys' posts being absurd and kind of lumped you in, though in all fairness you were making some pretty wild claims about time frames as little as a month ago, so it wasn't totally baseless on my part.  However, I still feel I owe you an apology, and I've been harsher in my disagreements than was necessary.


  • MalaboogaMalabooga Member UncommonPosts: 2,977
    edited May 2016
    Hrimnir said:

    @Quizzical

    That's precisely my point.  We're making wild guesses about how much availability they will have.  The difference here is that it's not like the 16/14nm processes are literally brand new. They're new for GPUs, but not for CPUs, and some of that knowledge and resolved teething issues does transfer over; not all of it, but some of it.

    I don't think anyone here was ever going to claim that the masses could go buy GTX 1080s on the day of release without waiting or paying a premium. In my memory of buying hardware I can't think of a single time when a low-yield high-end part like that was readily available in the first month or more after release. Again, if you take the recent example of the 6700K, which was in full-blown production for months and still selling over MSRP because demand was THAT high, you can see it's not ALL about production.

    Yes, for the first 4-6 weeks, maybe even 4-8 weeks, it's mostly about yields and such. However, my main point is that this is no different from any other high-end product in the history of computers, so frankly it's pointless to argue it as some kind of negative for Nvidia, because they and AMD have been doing exactly this since time started.

    I want to apologize; I've been railing on you pretty hard. I got a little caught up in some of the other fanboys' posts being absurd and kind of lumped you in, though in all fairness you were making some pretty wild claims about time frames as little as a month ago, so it wasn't totally baseless on my part.  However, I still feel I owe you an apology, and I've been harsher in my disagreements than was necessary.

    The demand for Skylake was very low; that's why it stayed at a high price. It received very bad reviews, and people went for the 20% cheaper Haswell (which also didn't need expensive new DDR4 RAM). Later, SkyOC killed -K sales.

    There has NEVER EVER been a Skylake chip shortage. In fact, the market is still flooded with unsold Haswell chips (and now Skylake chips, and the new-gen Kaby is on the way, and it's just another marginal "improvement"). I highly doubt Intel wants 3 gens of CPUs on the market at the same time. Sellers don't want it either, because it's 3 platforms and they still can't get rid of Haswell. Storage costs money. A lot of money.
  • YashaXYashaX Member EpicPosts: 3,100
    Darn it, I was trying to hold out for the new cards, but I had to get a new computer with a GTX 980 Ti recently because my last computer kicked the bucket. I thought the new tech wouldn't be out until at least the end of the year.
    ....
  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    edited May 2016
    Hrimnir said:

    @Quizzical

    That's precisely my point.  We're making wild guesses about how much availability they will have.  The difference here is that it's not like the 16/14nm processes are literally brand new. They're new for GPUs, but not for CPUs, and some of that knowledge and resolved teething issues does transfer over; not all of it, but some of it.

    I don't think anyone here was ever going to claim that the masses could go buy GTX 1080s on the day of release without waiting or paying a premium. In my memory of buying hardware I can't think of a single time when a low-yield high-end part like that was readily available in the first month or more after release. Again, if you take the recent example of the 6700K, which was in full-blown production for months and still selling over MSRP because demand was THAT high, you can see it's not ALL about production.

    Yes, for the first 4-6 weeks, maybe even 4-8 weeks, it's mostly about yields and such. However, my main point is that this is no different from any other high-end product in the history of computers, so frankly it's pointless to argue it as some kind of negative for Nvidia, because they and AMD have been doing exactly this since time started.

    I want to apologize; I've been railing on you pretty hard. I got a little caught up in some of the other fanboys' posts being absurd and kind of lumped you in, though in all fairness you were making some pretty wild claims about time frames as little as a month ago, so it wasn't totally baseless on my part.  However, I still feel I owe you an apology, and I've been harsher in my disagreements than was necessary.

    You focus on the Core i7-6700K, but that's a great outlier as Intel desktop CPUs go.  I can't think of any other that was notably hard to get immediately after launch.  AMD sometimes has softer launches of their CPUs, but usually not Intel.

    With Polaris, the GTX 1070, and any lower end, unannounced Pascal chips that may or may not exist, yeah, we're guessing when they'll really be available.  But with the GTX 1080, we don't know when it will be available, but we know that soon isn't possible.  Since you like Anandtech:

    http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory

    "Micron originally promised to start sampling of its GDDR5X with customers in Q1 and the company has formally delivered on its promise. What now remains to be seen is when designers of GPUs plan to roll-out their GDDR5X supporting processors. Micron claims that it is set to start mass production of the new memory this summer, which hopefully means we're going to be seeing graphics cards featuring GDDR5X before the end of the year."

    If one person says October and another says December for wide availability, then yeah, they're guessing.  But someone who says June is flatly wrong.  Similarly, if one person says the Warriors are going to win the NBA Finals this year, and another person says the Cavs will, they're guessing.  But someone who says the Clippers are going to win it all is simply wrong.

    Also, apology accepted.  But we can still disagree.  :)
  • HrimnirHrimnir Member RarePosts: 2,415
    I appreciate the link.  I generally do trust Anandtech.  It just seems extremely odd that Nvidia would announce a hard launch THAT far in advance of GDDR5X being readily available... it would be the height of stupidity. I mean, admittedly they've done some pretty stupid shit in the past, but you generally don't get to be a leader in your field by constantly making horrible business decisions.

    This is pure speculation at this point, but I still maintain that even October is really out there.  I know this is a bit of an older article, but at least as of February they had first sample runs of GDDR5X that were apparently clocking very well, so all indications at that point were that it would be in "mass production" by "summer" and that yields wouldn't be an issue.

    So what we have to speculate about is what they mean by summer; going by the calendar, summer technically starts June 20th.  So, I mean, I would say July-August is a reasonable assumption at this point for "mass availability".

    http://www.bit-tech.net/news/hardware/2016/02/11/micron-gddr5x/1
    Published on 11th February 2016 by Gareth Halfacree

    Memory giant Micron has announced that it is on-track to begin mass production of GDDR5X memory modules by the summer, having completed small-run test manufacturing of its first-generation 8Gb modules.
    Micron's Graphics DRAM Design Centre in Munich has confirmed these speeds as entirely achievable, boasting of hitting 13Gb/s in its first production run of 20nm 8Gb (1GB) GDDR5X modules.

    I agree with you that at this point the limiting factor is going to be the GDDR5X memory.  The actual GPU, while on a new process for GPUs, is not on a new process for CPUs, and as I said before, a lot of that knowledge absolutely transfers over. So I suspect Nvidia isn't going to have huge issues with yields by the time they're able to get large quantities of GDDR5X in their hands.
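As a quick sanity check on those memory numbers, the peak-bandwidth arithmetic is simple enough to sketch (the 256-bit bus and 10 Gb/s launch data rate are the GTX 1080's announced specs; 13 Gb/s is Micron's quoted first-run figure):

```python
def gddr_bandwidth_gb_s(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gb/s) times bus width, over 8 bits per byte."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

# GTX 1080: 10 Gb/s GDDR5X on a 256-bit bus
print(gddr_bandwidth_gb_s(10, 256))  # 320.0 GB/s
# Micron's quoted 13 Gb/s first-run silicon on the same bus width
print(gddr_bandwidth_gb_s(13, 256))  # 416.0 GB/s
```

So even launch-clocked GDDR5X on a 256-bit bus lands in the same ballpark as the previous generation's wider 384-bit GDDR5 configurations, which is presumably the point of the new memory.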


  • HrimnirHrimnir Member RarePosts: 2,415
    YashaX said:
    Darn it, I was trying to hold out for the new cards, but I had to get a new computer with a GTX 980 Ti recently because my last computer kicked the bucket. I thought the new tech wouldn't be out until at least the end of the year.
    My roommate did the same thing: he just bought a 980 Ti two weeks ago. I kept telling him that at the *insert expletive* latest it would be August/September, but he felt it was worth it to have the faster card now.  Admittedly, he was upgrading from a 760, so that's a hell of an upgrade.


  • SEANMCADSEANMCAD Member EpicPosts: 16,775
    Hrimnir said:
    YashaX said:
    Darn it, I was trying to hold out for the new cards, but I had to get a new computer with a GTX 980 Ti recently because my last computer kicked the bucket. I thought the new tech wouldn't be out until at least the end of the year.
    My roommate did the same thing: he just bought a 980 Ti two weeks ago. I kept telling him that at the *insert expletive* latest it would be August/September, but he felt it was worth it to have the faster card now.  Admittedly, he was upgrading from a 760, so that's a hell of an upgrade.
    I actually want my Rift order to be delayed for this reason. If not, no biggie; I'll just get a new machine with the 970.


  • QuizzicalQuizzical Member LegendaryPosts: 25,483
    Hrimnir said:
    I appreciate the link.  I generally do trust Anandtech.  It just seems extremely odd that Nvidia would announce a hard launch THAT far in advance of GDDR5X being readily available... it would be the height of stupidity. I mean, admittedly they've done some pretty stupid shit in the past, but you generally don't get to be a leader in your field by constantly making horrible business decisions.

    This is pure speculation at this point, but I still maintain that even October is really out there.  I know this is a bit of an older article, but at least as of February they had first sample runs of GDDR5X that were apparently clocking very well, so all indications at that point were that it would be in "mass production" by "summer" and that yields wouldn't be an issue.

    So what we have to speculate about is what they mean by summer; going by the calendar, summer technically starts June 20th.  So, I mean, I would say July-August is a reasonable assumption at this point for "mass availability".

    http://www.bit-tech.net/news/hardware/2016/02/11/micron-gddr5x/1
    Published on 11th February 2016 by Gareth Halfacree

    Memory giant Micron has announced that it is on-track to begin mass production of GDDR5X memory modules by the summer, having completed small-run test manufacturing of its first-generation 8Gb modules.
    Micron's Graphics DRAM Design Centre in Munich has confirmed these speeds as entirely achievable, boasting of hitting 13Gb/s in its first production run of 20nm 8Gb (1GB) GDDR5X modules.

    I agree with you that at this point the limiting factor is going to be the GDDR5X memory.  The actual GPU, while on a new process for GPUs, is not on a new process for CPUs, and as I said before, a lot of that knowledge absolutely transfers over. So I suspect Nvidia isn't going to have huge issues with yields by the time they're able to get large quantities of GDDR5X in their hands.

    While the discrete GPUs we're talking about aren't the first chips at 14/16 nm, they might well be the first high-power chips on TSMC 16 nm or Samsung 14 nm.  There are some previous low power chips, most notably the Apple A9 on both nodes, or rather, earlier variants of them.  But there are things that can go wrong when you try to burn 100 W that are fine at 1 W, so AMD and Nvidia surely ran into new problems.  I'm not aware of any previous chips on either node that burn more than 10 W, though presumably Xilinx has to produce their higher end FPGAs somewhere.

    If you're thinking of Intel's experience on 14 nm, that's a different process node entirely.  Whatever problems Intel ran into and had to fix presumably weren't shared with TSMC or Samsung.