I do think it's technically possible for nVidia to do this. I think nVidia held back a good bit on Turing on the tech side (and tried to make up for it on the price tag; excuse my poor joke there), and I think it's plausible that nVidia could put out a part that's as fast as these rumors seem to suggest.
And it wouldn't be too far of a stretch; the 1070 pretty well matched a 980 Ti...
But historically there has been a big gulf between an x70 and an x60 in terms of pricing. And a vast difference between 2080 Ti pricing and... pretty much everything else.
It would be ... difficult ... for nVidia to come out one generation later with a $299 card that equals a $1200 card that was, just the day before, current on the market. Of course, nothing would stop nVidia from re-valuing their current tech tiers, so a 3060 might carry an MSRP of $599.
If I had purchased a 2080 Ti and that happened, I don't know. On one hand, you'd have the 3080 Ti to crow over. On the other, that 3060 just tanked the value of your 2080 Ti. You always expect some depreciation, but a drop by a factor of four is a bitter pill.
If a 3060 beats a 2080 Ti then it might be time to buy a new computer. My GPU is a 1060 and the CPU is i7-class but really old.
Reading somewhat between the lines (and fanboy hype):
The 3060 may be faster than a 2080 Ti **in ray tracing only.**
It probably won't be faster at traditional rasterization. Current figures point to about a 15-30% generational increase in traditional GPU tasks, more or less on par with previous generational changes, but a much more efficient RTX engine for ray tracing.
A die shrink to a process node with 2.5x the transistor density of 12 nm FFN could mean that you get about the same number of transistors as TU102 in a 300 mm^2 die. That's not an outlandish die size for a *60 GPU; it's smaller than those used in the GTX 260, 460, 560, and RTX 2060, and not much larger than those used for the GTX 660, 760, and 1660. Having the same transistor count as the RTX 2080 Ti could plausibly produce a GPU competitive with it, at least if Nvidia doesn't waste a bunch of space on tensor cores.
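Just to make the back-of-envelope math explicit, here's a minimal sketch. The 2.5x density gain is the rumored assumption from above, not a confirmed figure; the TU102 numbers come from its public spec sheet:

```python
# Back-of-envelope die-size arithmetic for the scenario above.
# ASSUMPTION: 2.5x density gain over 12 nm FFN (rumored, not confirmed).

TU102_DIE_AREA_MM2 = 754     # RTX 2080 Ti die, 12 nm FFN
TU102_TRANSISTORS_B = 18.6   # billions of transistors
DENSITY_SCALING = 2.5        # assumed density gain from the node shrink

# Same transistor budget as TU102, packed at the denser node:
new_die_area = TU102_DIE_AREA_MM2 / DENSITY_SCALING
print(f"{TU102_TRANSISTORS_B}B transistors in ~{new_die_area:.0f} mm^2")
# -> ~302 mm^2, in line with historical *60-class die sizes
```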
I'm not predicting that it will or won't play out that way. But unlike some rumors that aren't technically feasible (some of which I've blasted as such in this very thread), an RTX 3060 or whatever that is competitive with an RTX 2080 Ti is plausible.
Of course, while the 3080 and 3070 will be coming out soonish, the 3060 is next year. Essentially, if you plan to make a build this Black Friday, you'll be able to get the 70 and 80 but not the 60.
Unless you have inside information and are risking getting fired from your job by leaking it, I wouldn't put much credibility in the rumored launch dates. Besides, even Nvidia doesn't know when parts are going to launch until they're ready to place the big production order. The exact number of respins that you'll need before you're happy with yields isn't predictable, and each respin can add another two months.