We knew quite a bit about Volta long before Pascal hit the market. Pascal is now aging and Volta is about to be released, and ...
There is remarkably little that I have seen being said. I mean, there is some talk of real-time ray-tracing, but certainly no talk of a 12xx series, what the cards will look like, or when. We had all of that far earlier than this in prior generations.
Do I take it 11xx will be it for quite a long time, as opposed to there being a significant new 12xx series a couple of years out?
Comments
*edit* Turing
Real-time raytracing has been a thing for a decade. The problem is that you need an architecture like AMD's, and it decreases the quality of the visuals for little benefit.
Or whatever other rumor wants to be floated next week.
Volta is out in the compute-oriented cards (Tesla/Titan-V), so there's a good case to be made for 1100 being Volta. But Volta is still at insane price points (maybe yields, maybe because Pascal is still selling out)... so maybe you just refresh Pascal into an 1100 to keep people from waiting for Volta (at least for a while).
Pascal right now is still King of the Hill. The Intel model in this situation was to milk it for everything that it's worth - don't release big jumps, because that only makes it harder to one-up yourself in the future. Do small but measurable performance increases on a steady cadence to keep the sales numbers rolling. That served Intel well for a long time, and AMD has just now caught up. Not saying I like it, but it makes for smart business sense.
nVidia as a company, though, has a reputation for making those big leaps, and their stock price has been rewarded because they have been able to deliver for a long while now.
If I were Jensen Huang...
The graphics card SKU market right now is a mess, largely because of the influence of mining. That needs to be addressed. nVidia is "sort of" doing that: they have a new 1050 SKU that shouldn't appeal to miners. They need to come out with some product that specifically does appeal to miners, so the rest of the GPU lineup can get unconstrained. Or just wait for Bitmain to do it for them and watch their mining market dry up and blow away (which may already have happened).
A mining-specific SKU would probably go a long way. No video out required, optimized clocks and firmware, datacenter-grade high density cooling. Get them to where they are competitive in price per hash, cut out what miners don't need, and add in what they do need. Either release as a reference card for the AIBs to manufacture, or get your first-party OEM to crank them out and make them available for bulk order.
Then "soft lock" the drivers, so that consumer cards can mine, but only with 1 or 2 cards at a time (similar to SLI). That lets Joe Miner keep mining in his basement on his gaming rig, but the folks with 200 8-card risers that are turning their basements into 1MW datacenters from buying up all your stock out of Best Buy. Sure, there's ways around that, but with a mining-specific SKU that goes around that and offers other benefits, there's no incentive to put in that work.
RAM is going to be an issue no matter what. Supply is constrained for everything right now. There's not a lot you can do about that, apart from getting those smart engineers to keep optimizing your memory-use algorithms and figuring out ways to do more with less, or mixing different memory types and creating even more cache levels, or something. It's like hard drive space in the early '90s: it was very expensive per MB, most programs were a few hundred kB at most and heavily optimized for space, and we had programs that would compress/decompress on the fly (those still exist, and are still used on SSDs). Not that GPU-grade RAM has ever been cheap, or that there has ever been a lot of it... but there's a big difference between something being merely expensive and it not being available at any cost.
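(Toy example of that "do more with less" trade-off, in the spirit of those old on-the-fly disk compressors; this has nothing to do with how GPU memory compression actually works.)

```python
# Toy illustration of trading compute for capacity, like the old on-the-fly disk
# compressors mentioned above. Not how GPU memory compression works; it just
# shows the space saving on compressible data in exchange for CPU time.
import zlib

data = b"framebuffer-ish repetitive data " * 4096   # highly compressible sample
packed = zlib.compress(data, level=6)

print(f"raw:        {len(data):>8} bytes")
print(f"compressed: {len(packed):>8} bytes ({100 * len(packed) / len(data):.1f}% of original)")

assert zlib.decompress(packed) == data   # lossless round trip
```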
Whatever comes after Pascal doesn't really matter until the mining and RAM situation gets resolved. Without that, it will just be more extremely high prices with almost no availability... you would be just as well served going out and getting the Titan-V, which is available now.
NVidia might want to decrease their chip size if they feel they're beating AMD by a wide margin, but I don't think they'd want to withhold any performance.
https://wccftech.com/rumor-amd-navi-mainstream-gpu-to-have-gtx-1080-class-performance/
Now, there are a variety of reasons why those simplifying assumptions are wrong, but being off by 20% here and 50% there isn't the difference between 197 nm on a side and a 12 nm process node.
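(The arithmetic being referenced isn't quoted above, but a figure like 197 nm on a side is consistent with a naive die-area-per-transistor calculation for something like GV100, which nVidia lists at roughly 21.1 billion transistors on an ~815 mm² die. My reconstruction, purely illustrative:)

```python
# Reconstruction (my assumption) of where a "197 nm on a side" figure comes from:
# naively divide die area by transistor count and take the square root.
# GV100's published numbers: ~21.1 billion transistors on an ~815 mm^2 die.
transistors = 21.1e9
die_area_mm2 = 815.0

area_per_transistor_nm2 = die_area_mm2 * 1e12 / transistors  # 1 mm^2 = 1e12 nm^2
side_nm = area_per_transistor_nm2 ** 0.5

print(f"~{side_nm:.0f} nm on a side per transistor, on a '12 nm' process")  # ~197 nm
```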
As others have suggested, you might just see an improved Pascal for the next release.
I completely agree with Torval: 4K is still a ways off for most gamers. I am using a 40" 4K TV as my main monitor at 1440p and have few issues. My expectation is that I will not bother to upgrade my GPU this year unless something like what Cleffy surmises happens with AMD.
https://www.pcgamesn.com/nvidia-next-gen-gpu-hot-chips
That means you will be lucky to find one of these cards for Xmas.
It's good to know that we're getting some info on NVidia's next gen GPUs, but that doesn't necessarily mean a product reveal or release date announcement.
Tom's Hardware writes:
"Nvidia's presentation doesn't directly signify that the new GPUs will come to market soon. We first learned about AMD's Zen microarchitecture at the event in August 2016, but those processors didn't land in desktop PCs until March 2017. Conversely, many companies provide more detail on shipping products, so there's a chance that Nvidia's latest GPUs could already be on the market when the presentation takes place"
https://www.tomshardware.com/news/nvidia-gtx-1180-gtx-2080-hot-chips-30,37152.html
We might have more info on the architecture by then, but shipping products? Sorry if I scoff at that.
It'll still be a year or more for AMD's high end, and with Vega being a disappointing failure unable to capture any of the high-end market, history will repeat itself: folks will be in for a long wait on a later-to-market offering.
Vega is just now starting to show up near MSRP. It's been sold out or in limited availability almost constantly.
Now, those on Team Green will say it's because of production problems - AMD can't make enough, yields are bad, HBM availability sucks, whatever. (Let's ignore the fact that nVidia is in almost the same boat). And they do have some evidence - Steam Hardware survey (however much stock you put into that) hasn't shown a big uptick in AMD ownership (it is up, but it's still not a big number).
Those on Team Red will say "WTF, miners suck". And they have a point there. AMD's Vega is among the most desirable cards for mining. And miners are buying them all. And that would be one logical explanation as to why the Steam Survey numbers haven't budged much.
AMD is selling every card they can make, and almost all of them are selling for more than AMD's recommended price point. From AMD's point of view, it's hardly a failure. It is disappointing, though, because the intended audience - gamers - can hardly get hold of them.
I don't think it's a yield problem, either--or at least not GPU yields. (There could be issues with HBM2 yields.) If the problem were yields, we'd expect to see more cut-down versions of them. Instead, there are only two bins, one of which is fully functional, and the other not really that far from it. Besides, the Polaris 10 cards were in stock just fine for a number of months before the miners started grabbing them all, and to have good yields for a while, then suddenly have yield problems that last for an entire year, just doesn't happen. (Shorter-lived yield problems because the fab goofed and wrecked a batch can happen.)
Nvidia's CEO just said that the next GeForce GPU launch will be "a long time from now".
Titan V is available today, if you want to pay for it ($3,000). So there have been comparisons of Volta versus Pascal.
https://www.hardocp.com/article/2018/03/20/nvidia_titan_v_video_card_gaming_review
Yeah, Volta will be a decent bump over Pascal, if we are just looking at architecture vs. architecture and not specific card vs. specific card.
It's not that Nvidia and AMD just want to milk the situation. It's that they've effectively had one new process node to move to in the last six years. Die shrinks are what drive the long-term Moore's Law gains, and performance gains will be very slow without them.
If Nvidia could release a new generation soon that was 30% faster than the old for the same cost to build, they probably would. Similarly if they could release a new generation that was the same speed as the old but cost 30% less to build. But if all that the new generation would offer in the same cost of production is 5% more performance than before, then of course they're not going to spend a zillion dollars to create that.
The move to 7 nm will probably offer considerable gains over what is available today. We'll see new generations then. But you can't just magically launch cards on process nodes that aren't mature enough yet.
nVidia has never shied away from big dies.
The Volta in question has a lot of whatever-the-heck tensor cores are... which are presumably not all that useful for gaming, and presumably aren't affecting the gaming benchmarks that have been done. Remove those, and die size ~probably~ comes down to something more manageable. Speculation on my part, I know, but probably more true than not.
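(For anyone else wondering what a tensor core actually is: each one does a small fused matrix multiply-accumulate, D = A×B + C on 4×4 tiles, FP16 inputs with FP32 accumulation. A rough numpy emulation of a single op, just for intuition about why they matter for neural-net math rather than rasterized games:)

```python
# Rough numpy emulation of one Volta tensor core operation: D = A @ B + C on 4x4
# tiles, FP16 inputs with FP32 accumulation. Illustrative only; the hardware does
# this as a single fused op, many times per clock.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)).astype(np.float16)   # FP16 input tile
B = rng.standard_normal((4, 4)).astype(np.float16)   # FP16 input tile
C = rng.standard_normal((4, 4)).astype(np.float32)   # FP32 accumulator tile

# Promote to FP32 before the multiply to mimic the full-precision products
# that get accumulated in FP32.
D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D)
```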
It could just be a matter of scale: Titan Xp has 3840 shading units, the V has 5120 -- roughly 33% more shading units for roughly 30% more performance... which sounds like a wash, except you're getting that extra performance for no net increase in power draw, so there's the combined benefit of the 12nm vs 16nm node and the Volta vs Pascal architecture. How much is node and how much is architecture - I couldn't say.
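Quick back-of-envelope on that (shader counts as above; the 250 W board power for both cards is my recollection and could be slightly off):

```python
# Back-of-envelope for the scale argument above. Shader counts are from the post;
# the 250 W TDP for both cards is my assumption and may be slightly off.
titan_xp = {"shaders": 3840, "tdp_w": 250}
titan_v  = {"shaders": 5120, "tdp_w": 250}

shader_ratio = titan_v["shaders"] / titan_xp["shaders"]   # ~1.33
perf_ratio = 1.30                                         # ~30% uplift, the figure used above
power_ratio = titan_v["tdp_w"] / titan_xp["tdp_w"]        # 1.0 if both boards are 250 W
perf_per_watt_gain = perf_ratio / power_ratio

print(f"shading units: +{(shader_ratio - 1) * 100:.0f}%")
print(f"performance:   +{(perf_ratio - 1) * 100:.0f}% at roughly the same board power")
print(f"perf per watt: x{perf_per_watt_gain:.2f}")
```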