I consider Intel's GPUs vaporware until proven wrong.
Given that Raja is on the case, I'm sure something will ship eventually, and that it will be a giant heap of promises that delivers only on its ability to disappoint.
https://www.techpowerup.com/297760/intel-asks-xe-hpg-scavenger-hunt-winners-to-accept-a-cpu-in-lieu-of-graphics-card
Apparently Intel had some contest and offered the winners one of the new video cards. Now they're asking people to accept an Alder Lake CPU instead. That's not really a bait and switch, as they're saying that they'll still give the winners the promised video card rather than the CPU if they're willing to wait for it. But that sure makes it look like the launch isn't just around the corner if they can't even drum up a few hundred cards for the winners of a contest.
It's also possible that Intel isn't pushing a wider launch just yet because of drivers. Some early reviews of the A380, based on cards imported from China, found that the drivers were catastrophically awful. A more recent review on TechPowerUp found that they've improved a lot: the latest beta driver fixed most of the problems present in the latest "stable" release driver.
I'd expect that most new generations of AMD and Nvidia video cards use drivers heavily derivative of the ones that the company made for previous generations. Yes, new architectures change some things, and sometimes they add new features. But that's surely a lot easier than trying to create a brand new driver software package nearly from scratch. Intel is basically doing exactly that.
They announced they are punting entirely on native DX9 support in their drivers, relying on a Microsoft-supplied DX9->DX12 emulation layer instead. This came just a couple of days after Tom Petersen and Ryan Shrout acknowledged that DX9 and DX11 were issues but said they were committed to working through them. "Labor of Love" was the exact quote.
Yes, this is the first generation of discrete graphics, but they have been doing graphics since way before DX9 -- plenty of time for them to have adapted and built up a unified driver even without redoing the entire thing from scratch. So... yeah... drivers take time to do right, but Intel has had ~plenty~ of time, and I don't give them a pass on that at all.
The ball very obviously got dropped, badly. And it's put Intel in a very bad position.
And from all the news coming from Intel, it sure doesn't seem like the left hand is talking to the right hand, and that's probably been most of the problem the entire time.
For many years, Intel didn't particularly care about gaming performance. Sometime around 2010 or 2011, they gave a list of things that they were focused on when designing their GPUs, and games were conspicuously absent from the list. So yeah, there was a lot that they could have done years ago, but didn't. And that means a bigger backlog of things to fix now.
But there are also a lot of things that are genuinely new. For starters, in the past, Intel could always assume that the CPU and GPU shared the same physical memory pool so that you didn't have to worry about copying data back and forth. That's not true of a discrete GPU, and that makes a huge difference.
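To make that concrete, here's a minimal sketch of the two situations using PyOpenCL (chosen only because OpenCL runs on Intel's integrated and discrete GPUs alike). This is just an illustration of the concept, not anything from Intel's actual driver, and the buffer names are made up:

```python
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
data = np.arange(1_000_000, dtype=np.float32)
mf = cl.mem_flags

# Integrated GPU: the CPU and GPU share one physical pool of RAM, so the driver
# can often hand the GPU the existing host allocation directly (zero-copy).
shared_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.USE_HOST_PTR, hostbuf=data)

# Discrete GPU: the data has to end up in the card's own VRAM, so the driver must
# allocate device memory and push the bytes across PCIe before the GPU can use them.
vram_buf = cl.Buffer(ctx, mf.READ_ONLY, size=data.nbytes)
cl.enqueue_copy(queue, vram_buf, data)
queue.finish()
```

Multiply that by every texture, vertex buffer, and render target in a game, plus deciding what stays resident in VRAM and what gets evicted, and it's a whole layer of driver work that matters far less when everything lives in one shared memory pool.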
Furthermore, Intel has long had drivers for basic rendering, but making a full software package with a bunch of extra features (e.g., adjusting hardware clock speeds or forcing different settings for different games) is new. AMD and Nvidia have long had that, but Intel hasn't.
Ultimately, they're making progress, and a lot of games more or less work right now. But they've still got a long way to go:
https://videocardz.com/newz/intel-arc-a380-gpu-tested-in-50-games-compatibility-better-than-expected-for-a-newcomer
It's possible that they don't want a large-scale launch until their drivers are in better shape. A video card that only works for 80% of the games that you want to play is a video card that doesn't work. If you buy a card in September and a lot of games don't work properly, you're going to be a lot more upset and for a lot longer than if you bought the same card in December and the drivers were decent.
As for the DX9-to-DX12 emulation layer, whether that's reasonable depends tremendously on how big the performance hit is. New versions of APIs are commonly very close to being supersets of the old versions, so the layer might only have to paper over a handful of things that were deprecated or removed. Or the gap might be more substantial than that.
Even if it is a substantial performance hit, for old enough games, it might not matter. If you could get 300 frames per second with an optimized driver, but only get 200 frames per second with the emulated version, do you really care?
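One way to see why: treat the translation layer as adding a roughly fixed amount of extra work per frame (a simplification, but fine for back-of-the-envelope math). The hit that knocks 300 fps down to 200 fps is only about 1.67 ms per frame, and DX9-era games are mostly the ones running at those very high frame rates anyway. A quick sketch, using the hypothetical numbers from the paragraph above:

```python
def fps_with_overhead(base_fps, overhead_ms):
    """Frame rate after adding a fixed per-frame cost, e.g. an API translation layer."""
    return 1000.0 / (1000.0 / base_fps + overhead_ms)

# Going from 300 fps to 200 fps corresponds to ~1.67 ms of extra work per frame.
overhead_ms = 1000.0 / 200 - 1000.0 / 300
print(round(overhead_ms, 2))                        # 1.67 (ms per frame)
print(round(fps_with_overhead(300, overhead_ms)))   # 200 fps: looks like a huge loss
print(round(fps_with_overhead(60, overhead_ms)))    # ~55 fps: same cost, much smaller dent
```

The percentage drop looks scary at 300 fps, but the absolute cost is tiny, and at the frame rates where it could actually hurt, the same cost only shaves off a handful of frames per second.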