Intel Arc A750 Limited Edition Review | MMORPG.com

System, Member Uncommon, Posts: 12,599
edited March 2023 in News & Features Discussion

Intel is aiming to get into the dedicated graphics business alongside Nvidia and AMD, launching its Arc line of GPUs last fall. We had the chance to put the Intel Arc A750 Limited Edition through its paces this past month, seeing if Intel's $250 GPU is worth a look compared to its competition.

Read the full story here


Comments

  • Quizzical, Member Legendary, Posts: 25,483
    If Intel is sending out review samples now, then I take it that they're happy with the state of their drivers, or at least, for their Windows drivers for graphics APIs running recent games. Their drivers have come a long way, but still have a long way to go.
  • DigDuggy, Member Rare, Posts: 694
    I find it odd to have internal components be limited edition.  Nothing wrong with it; I just think of 'limited edition' things as being front-facing.

  • Quizzical, Member Legendary, Posts: 25,483
    DigDuggy said:
    I find it odd to have internal components be limited edition.  Nothing wrong with it; I just think of 'limited edition' things as being front-facing.

    Calling it "limited edition" is just marketing.  For whatever reason, Intel decided to label the video cards that they build themselves as "limited edition".  Those built by board partners get called something else.  If you're looking to buy an Intel Arc A770 on Newegg right now, you can buy the Intel "limited edition", the ASRock "phantom gaming", or the Acer "predator bifrost" version.
  • DigDuggy, Member Rare, Posts: 694
    Quizzical said:
    DigDuggy said:
    I find it odd to have internal components be limited edition.  Nothing wrong with it; I just think of 'limited edition' things as being front-facing.

    Calling it "limited edition" is just marketing.  For whatever reason, Intel decided to label the video cards that they build themselves as "limited edition".  Those built by board partners get called something else.  If you're looking to buy an Intel Arc A770 on Newegg right now, you can buy the Intel "limited edition", the ASRock "phantom gaming", or the Acer "predator bifrost" version.
    Ah, gotcha, thx.
  • Cooe, Newbie Common, Posts: 1
    FSR 2.2 fixes almost all of the motion ghosting issues endemic to prior versions (although to varying, game/engine-dependent degrees), significantly reducing that advantage for ARC. It's also fully open source, meaning adoption by game developers (esp. of the latest versions) is just WAAAAAAAY faster.

    Aka, you simply DO NOT NEED AI for high quality temporal upscaling. (Something both AMD AND Epic Games seem to have realized.) That's nothing but a load of utter horses**t perpetuated by Nvidia to justify the MASSIVE silicon die space they dedicated to Tensor Cores starting w/ Turing / the RTX 2000 series, which were ACTUALLY added for professional compute reasons.

    (Nvidia would never tell you this, but modern DLSS 2 doesn't even significantly use the Tensor Cores anymore! They gave up on that after the disaster that was DLSS 1, which was PROPERLY Tensor accelerated.)

    Also XeSS only gives acceptable visual results ON ARC, with the compatibility layer used for other brands' GPUs looking ABSOLUTELY HORRIBLE, meaning game developers are EVEN LESS likely to add it to their games over FSR & DLSS. For this very reason, banking on widespread XeSS support is FREAKING IDIOTIC!!!

    Buying ARC Alchemist over RDNA 2 primarily "because it has hardware AI acceleration" is a fool's errand of clueless stupidity.
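    For what it's worth, the heuristic, AI-free approach being referred to boils down to a few well-known steps: reproject last frame's output using motion vectors, clamp the reprojected color against the current frame's local neighborhood to suppress ghosting, then blend. Below is a minimal sketch of that accumulation step in CUDA; it is illustrative only, not FSR source code, and the 3x3 clamp window and fixed blend factor are simplifying assumptions.

    ```cuda
    #include <cuda_runtime.h>

    // One step of heuristic temporal accumulation (the TAA/FSR2-family idea).
    // `current` is the jittered current frame; `history` is last frame's
    // output, already reprojected upstream using the game's motion vectors.
    __global__ void temporalAccumulate(const float3* current,
                                       const float3* history,
                                       float3* output,
                                       int width, int height)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= width || y >= height) return;
        int idx = y * width + x;

        // Min/max of the 3x3 neighborhood in the current frame. Clamping the
        // history sample to this box is the classic hand-tuned ghosting fix:
        // stale colors outside the plausible local range get pulled back in.
        float3 nMin = make_float3(1e30f, 1e30f, 1e30f);
        float3 nMax = make_float3(-1e30f, -1e30f, -1e30f);
        for (int dy = -1; dy <= 1; ++dy) {
            for (int dx = -1; dx <= 1; ++dx) {
                int nx = min(max(x + dx, 0), width - 1);
                int ny = min(max(y + dy, 0), height - 1);
                float3 c = current[ny * width + nx];
                nMin = make_float3(fminf(nMin.x, c.x), fminf(nMin.y, c.y), fminf(nMin.z, c.z));
                nMax = make_float3(fmaxf(nMax.x, c.x), fmaxf(nMax.y, c.y), fmaxf(nMax.z, c.z));
            }
        }
        float3 h = history[idx];
        h = make_float3(fminf(fmaxf(h.x, nMin.x), nMax.x),
                        fminf(fmaxf(h.y, nMin.y), nMax.y),
                        fminf(fmaxf(h.z, nMin.z), nMax.z));

        // Exponential blend: mostly history for stability, a little of the
        // current frame for responsiveness. Real upscalers vary this per pixel.
        const float alpha = 0.1f;
        float3 c = current[idx];
        output[idx] = make_float3(h.x + alpha * (c.x - h.x),
                                  h.y + alpha * (c.y - h.y),
                                  h.z + alpha * (c.z - h.z));
    }
    ```

    No neural network anywhere; the quality differences between FSR versions come down to how cleverly the clamp and blend are tuned.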
  • Gorwe, Member, Posts: 1,593
    edited March 2023
    Quizzical said:
    If Intel is sending out review samples now, then I take it that they're happy with the state of their drivers, or at least, for their Windows drivers for graphics APIs running recent games. Their drivers have come a long way, but still have a long way to go.
    Not only this, but they started to, what's the term again, outsource(?) their boards to other manufacturers. Asus / ASRock are interested and ... Acer? I know it sounds weird. Have you got any news on that front?

    I am very interested in how the Battlemage is going to perform. Also, Celestial? Why not Cleric, fml.
  • Quizzical, Member Legendary, Posts: 25,483
    Cooe said:
    FSR 2.2 fixes almost all of the motion ghosting issues endemic to prior versions (although to varying, game/engine-dependent degrees), significantly reducing that advantage for ARC. It's also fully open source, meaning adoption by game developers (esp. of the latest versions) is just WAAAAAAAY faster.

    Aka, you simply DO NOT NEED AI for high quality temporal upscaling. (Something both AMD AND Epic Games seem to have realized.) That's nothing but a load of utter horses**t perpetuated by Nvidia to justify the MASSIVE silicon die space they dedicated to Tensor Cores starting w/ Turing / the RTX 2000 series, which were ACTUALLY added for professional compute reasons.

    (Nvidia would never tell you this, but modern DLSS 2 doesn't even significantly use the Tensor Cores anymore! They gave up on that after the disaster that was DLSS 1, which was PROPERLY Tensor accelerated.)

    Also XeSS only gives acceptable visual results ON ARC, with the compatibility layer used for other brands' GPUs looking ABSOLUTELY HORRIBLE, meaning game developers are EVEN LESS likely to add it to their games over FSR & DLSS. For this very reason, banking on widespread XeSS support is FREAKING IDIOTIC!!!

    Buying ARC Alchemist over RDNA 2 primarily "because it has hardware AI acceleration" is a fool's errand of clueless stupidity.
    Open source doesn't really make adoption much easier.  You may occasionally want to check on what a subroutine is doing internally, but for the most part, developers just want to call a library that works and not look at the source code for it.
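    To make that concrete: with Nvidia's cuBLAS, a half-precision matrix multiply is a single call, and the library routes it to tensor-core kernels on hardware that has them; the developer never reads a line of its internals. A minimal sketch (the wrapper function name is made up for illustration; error checking omitted):

    ```cuda
    #include <cublas_v2.h>
    #include <cuda_fp16.h>

    // Multiply column-major half-precision matrices: C = A * B.
    // d_A, d_B, d_C are device pointers; `handle` comes from cublasCreate().
    void gemm_via_library(cublasHandle_t handle,
                          const __half* d_A, const __half* d_B, __half* d_C,
                          int m, int n, int k)
    {
        const __half alpha = __float2half(1.0f);
        const __half beta  = __float2half(0.0f);
        // One library call; whether tensor cores get used is cuBLAS's problem,
        // not the caller's.
        cublasHgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                    m, n, k,
                    &alpha, d_A, m,
                            d_B, k,
                    &beta,  d_C, m);
    }
    ```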

    The reason that Nvidia includes tensor cores in their GPUs is for the compute market.  And not really the entire compute market, but just a substantial subset of machine learning.  But Nvidia makes more money on the compute market than they do by selling GeForce cards to gamers.

    I think that Intel is after the compute market, and actually more interested in that market than consumer graphics.  But you can't compete in the GPU compute market if you don't have credible GPUs.

    Tensor cores are basically useless for consumer graphics.  Nvidia has shoehorned some code into using them for reasons of marketing, but that's hardly evidence that they're actually useful.  Tensor cores are actually an enormous pain to use by any means other than calling a library that Nvidia has written to use them.
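    For a sense of how painful direct use is, here is roughly what the tensor cores look like through CUDA's warp-level wmma API: hardware-fixed 16x16x16 tile shapes, all 32 threads of a warp cooperating on every call, and opaque fragment layouts you cannot index into normally. A minimal sketch computing one output tile (requires a Volta-or-newer Nvidia GPU; addressing is simplified):

    ```cuda
    #include <mma.h>
    #include <cuda_fp16.h>
    using namespace nvcuda;

    // One warp computes a single 16x16 tile of C = A * B in mixed precision.
    __global__ void wmma_tile(const half* A, const half* B, float* C,
                              int lda, int ldb, int ldc)
    {
        // Fragment shapes (16x16x16) and layouts are dictated by the hardware.
        wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> aFrag;
        wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::row_major> bFrag;
        wmma::fragment<wmma::accumulator, 16, 16, 16, float> cFrag;

        wmma::fill_fragment(cFrag, 0.0f);
        // Every call below is warp-collective: all 32 threads participate.
        wmma::load_matrix_sync(aFrag, A, lda);
        wmma::load_matrix_sync(bFrag, B, ldb);
        wmma::mma_sync(cFrag, aFrag, bFrag, cFrag);
        wmma::store_matrix_sync(C, cFrag, ldc, wmma::mem_row_major);
    }
    ```

    And that is the easy part; tiling a full-sized matrix multiply around these fragments, with shared-memory staging, is where it gets genuinely ugly, which is why nearly everyone just calls cuBLAS or cuDNN instead.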

    One reason to put compute stuff that is useless for consumers into consumer graphics parts is that sometimes the same GPU chips are used for both.  The same GPU chip as used in the GeForce RTX 3090 was also sold as the A40 and A10 for compute.  The GPU chip used in the GeForce RTX 4090 is also sold as the L40 for compute.

    But another reason is for the sake of driver support.  AMD's "ROCm" compute drivers are complete garbage, which makes the compute cards that rely on them pretty useless.  One reason for it is that hardly anyone has access to use them, so bugs don't get reported and fixed.  The handful of supported cards cost a fortune and are difficult to find and buy at all.

    In contrast, plenty of hobbyists and students have a GeForce card that is supported by Nvidia's GPU compute drivers.  If someone wants to try an Nvidia GPU for compute, it's easy to buy one, and a lot of people even happen to already have one.  That helps to build more of an ecosystem for people who use Nvidia GPUs for compute.
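    The barrier to entry really is that low: on any machine with a GeForce card and the standard driver installed, a few lines against the CUDA runtime API will find the GPU. A minimal sketch:

    ```cuda
    #include <cstdio>
    #include <cuda_runtime.h>

    // Lists every CUDA-capable GPU in the machine. On a typical gaming PC
    // with a GeForce card, this just works with the normal graphics driver.
    int main()
    {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            std::printf("No CUDA-capable GPU found.\n");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            std::printf("Device %d: %s (compute capability %d.%d)\n",
                        i, prop.name, prop.major, prop.minor);
        }
        return 0;
    }
    ```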

    Intel basically has to offer GPU compute support in their consumer parts in order to make a credible play for the GPU compute market.  And that means putting in things like their version of tensor cores, which isn't actually useful for consumer graphics.  They're making progress, and at least on Linux, their driver support for GPU compute is already better than AMD's, though that's admittedly a very low bar.
  • Quizzical, Member Legendary, Posts: 25,483
    edited March 2023
    Gorwe said:
    Quizzical said:
    If Intel is sending out review samples now, then I take it that they're happy with the state of their drivers, or at least, for their Windows drivers for graphics APIs running recent games. Their drivers have come a long way, but still have a long way to go.
    Not only this, but they started to, what's the term again, outsource(?) their boards to other manufacturers. Asus / ASRock are interested and ... Acer? I know it sounds weird. Have you got any news on that front?

    I am very interested in how the Battlemage is going to perform. Also, Celestial? Why not Cleric, fml.
    Intel's problem with Arc Alchemist is that it was greatly delayed, so that rather than competing against the RTX 3000 and RX 6000 series, it has to compete against their successors on a better process node.  I'm not sure which process node they're targeting with Battlemage, but being a node behind makes it very difficult to be competitive.  If they catch up, they'll probably be far more competitive.

    Yes, Intel is having board partners build their video cards just like AMD and Nvidia do, and probably for the same reasons.  If they get enough uptake from board partners, they might well stop selling their own cards directly and just let board partners handle it.  But they had to start somewhere, and at least initially, Intel didn't have any GPU board partners because they didn't have any discrete GPUs at all.
  • ShankTheTank, Associate Editor / News Manager, MMORPG.COM Staff, Member Rare, Posts: 230
    Honestly, for a first generation effort in both ray tracing and machine learning, I’m liking what I’m seeing from Intel on their A series cards. AMD is three generations behind on the ML features at this point. That’s just terrible and borderline inexcusable for such a veteran player in the PC hardware space. I’m optimistic for Intel in their GPU endeavors.
  • Gorwe, Member, Posts: 1,593
    edited March 2023
    Quizzical said:
    Gorwe said:
    Quizzical said:
    If Intel is sending out review samples now, then I take it that they're happy with the state of their drivers, or at least, for their Windows drivers for graphics APIs running recent games. Their drivers have come a long way, but still have a long way to go.
    Not only this, but they started to, what's the term again, outsource(?) their boards to other manufacturers. Asus / ASRock are interested and ... Acer? I know it sounds weird. Have you got any news on that front?

    I am very interested in how the Battlemage is going to perform. Also, Celestial? Why not Cleric, fml.
    Intel's problem with Arc Alchemist is that it was greatly delayed, so that rather than competing against the RTX 3000 and RX 6000 series, it has to compete against their successors on a better process node.  I'm not sure which process node they're targeting with Battlemage, but being a node behind makes it very difficult to be competitive.  If they catch up, they'll probably be far more competitive.

    Yes, Intel is having board partners build their video cards just like AMD and Nvidia do, and probably for the same reasons.  If they get enough uptake from board partners, they might well stop selling their own cards directly and just let board partners handle it.  But they had to start somewhere, and at least initially, Intel didn't have any GPU board partners because they didn't have any discrete GPUs at all.

    Those partners being? Acer and Asus? Anyone else?
  • Vrika, Member Legendary, Posts: 7,973
    edited March 2023
    Gorwe said:
    Quizzical said:
    Gorwe said:
    Quizzical said:
    If Intel is sending out review samples now, then I take it that they're happy with the state of their drivers, or at least, for their Windows drivers for graphics APIs running recent games. Their drivers have come a long way, but still have a long way to go.
    Not only this, but they started to, what's the term again, outsource(?) their boards to other manufacturers. Asus / ASRock are interested and ... Acer? I know it sounds weird. Have you got any news on that front?

    I am very interested in how the Battlemage is going to perform. Also, Celestial? Why not Cleric, fml.
    Intel's problem with Arc Alchemist is that it was greatly delayed, so that rather than competing against the RTX 3000 and RX 6000 series, it has to compete against their successors on a better process node.  I'm not sure which process node they're targeting with Battlemage, but being a node behind makes it very difficult to be competitive.  If they catch up, they'll probably be far more competitive.

    Yes, Intel is having board partners build their video cards just like AMD and Nvidia do, and probably for the same reasons.  If they get enough uptake from board partners, they might well stop selling their own cards directly and just let board partners handle it.  But they had to start somewhere, and at least initially, Intel didn't have any GPU board partners because they didn't have any discrete GPUs at all.

    Those partners being? Acer and Asus? Anyone else?
    Currently Acer, ASRock, Gigabyte, and Gunnir have released Alchemist desktop GPUs.

    https://www.techpowerup.com/gpu-specs/arc-a770.c3914
    https://www.techpowerup.com/gpu-specs/arc-a380.c3913

    But unless you're planning on buying a current-generation Intel board, it doesn't matter much who Intel's partners are at the moment. New partners can join up quickly enough if Intel's next generation looks like it would get large sales numbers.
  • Quizzical, Member Legendary, Posts: 25,483
    ShankTheTank said:
    Honestly, for a first generation effort in both ray tracing and machine learning, I’m liking what I’m seeing from Intel on their A series cards. AMD is three generations behind on the ML features at this point. That’s just terrible and borderline inexcusable for such a veteran player in the PC hardware space. I’m optimistic for Intel in their GPU endeavors.
    The lack of machine learning support is a problem for AMD's compute cards, but it's irrelevant to the consumer space.  And really, when it comes to selling compute cards, silicon is the least of AMD's problems.
  • Iselin, Member Legendary, Posts: 18,719
    Not sold on their commitment to graphics cards yet (time will tell), but it would be nice to have a viable third player bringing meaning back to "bang for the buck" in the outrageously overpriced current market.
    "Social media gives legions of idiots the right to speak when they once only spoke at a bar after a glass of wine, without harming the community ... but now they have the same right to speak as a Nobel Prize winner. It's the invasion of the idiots”

    ― Umberto Eco

    “Microtransactions? In a single player role-playing game? Are you nuts?” 
    ― CD PROJEKT RED

  • Gorwe, Member, Posts: 1,593
    Iselin said:
    Not sold on their commitment to graphics cards yet (time will tell), but it would be nice to have a viable third player bringing meaning back to "bang for the buck" in the outrageously overpriced current market.
    Yes. That's exactly why I am eager to see how their first true foray, the Battlemage, is going to turn out.
  • Agatha767, Newbie Common, Posts: 4
    Cooe said:
    FSR 2.2 fixes almost all of the motion ghosting issues endemic to prior versions (although to varying, game/engine-dependent degrees), significantly reducing that advantage for ARC. It's also fully open source, meaning adoption by game developers (esp. of the latest versions) is just WAAAAAAAY faster.

    Aka, you simply DO NOT NEED AI for high quality temporal upscaling. (Something both AMD AND Epic Games seem to have realized.) That's nothing but a load of utter horses**t perpetuated by Nvidia to justify the MASSIVE silicon die space they dedicated to Tensor Cores starting w/ Turing / the RTX 2000 series, which were ACTUALLY added for professional compute reasons.

    (Nvidia would never tell you this, but modern DLSS 2 doesn't even significantly use the Tensor Cores anymore! They gave up on that after the disaster that was DLSS 1, which was PROPERLY Tensor accelerated.)

    Also XeSS only gives acceptable visual results ON ARC, with the compatibility layer used for other brands' GPUs looking ABSOLUTELY HORRIBLE, meaning game developers are EVEN LESS likely to add it to their games over FSR & DLSS. For this very reason, banking on widespread XeSS support is FREAKING IDIOTIC!!!

    Buying ARC Alchemist over RDNA 2 primarily "because it has hardware AI acceleration" is a fool's errand of clueless stupidity.
    FSR 2.2 fixes almost all of the motion ghosting issues, but not all of them; it's still my choice.