
AMD GPU encoders inferior to Nvidia's HEVC encoder

MrMonolitas Member Uncommon, Posts: 263
Hello, I have a question. Recently I dove back into fiddling with streaming and recording videos with my RX 580, but the quality of the videos was just not what I wanted, and it bothered me that nobody could explain to me why that is. I started looking into the encoders themselves on AMD and Nvidia and noticed that Nvidia's HEVC encoder is superior to AMD's encoders, which was a really big surprise to me. Why AMD was not able to keep up with Nvidia on encoders boggles my mind. Apparently Nvidia's encoder is so good that it does not impact PC performance in any way and outputs super good quality videos; there is nothing better than that.
So my AMD experts, when do you expect AMD to tackle this problem? Because this feature is just a must-have these days.

Comments

  • MrMonolitas Member Uncommon, Posts: 263
    I never said that I can't stream. I just wanted to point out that the encoder is complete garbage compared to Nvidia's, and maybe someone has an insider's scoop on when they are planning to fix that.
  • Quizzical Member Legendary, Posts: 25,483
    How do you know that the recording quality is inferior?  In what way is it inferior?  Worse image quality?  Larger size?

    If the underlying issue is that it is using some encoding that isn't as good, it could be a simple problem of using a GPU that is too old.  Your GPU chip released in mid-2016, which means that it pretty much had to tape out by mid-2015.  That means that in order to have hardware acceleration for a particular encoding (which is something that GPUs rely heavily upon), all the work had to be done by then.

    While we're at it, what Nvidia GPU are you comparing it to?  The contemporary Nvidia generation to your GPU is Pascal, or the GeForce 1000 series.  If you're comparing it to Turing, then those are much newer.  As to the question of whether AMD will address it, the better question is whether they already have in Navi.

    It's also possible that an updated video encoding block is simply one of the things that AMD decided to skip to save money.  Recall that back when Polaris launched in 2016, AMD was in serious trouble and trying to save money anywhere they could to stave off bankruptcy.  Ryzen saved the company, and now they're able to invest where they should.
  • AmazingAvery Age of Conan Advocate, Member Uncommon, Posts: 7,188
    MrMonolitas said:
    ...snip...
    Are you referring to AMD ReLive vs Nvidia ShadowPlay in terms of capturing and streaming?



  • MrMonolitas Member Uncommon, Posts: 263
    edited April 2020
    I am using the OBS AMD AVC Encoder. I'm not comparing it to Turing. Even on Navi it is still inferior; by that I mean the quality of the output is still worse than Turing's. Mostly you notice it in fast-paced gameplay videos. That's what people say about AMD encoders, that they're just bad.

    I have seen a video with new comparisons. Let me...
    There, I found it. You can clearly see the tests.


    I even stopped using the GPU encoder and switched to the CPU; the quality is better.
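    If anyone wants to reproduce this kind of comparison outside of OBS, here is a minimal sketch that drives ffmpeg from Python, assuming an ffmpeg build that includes the AMF and NVENC encoders and working drivers; the input file name and bitrate are placeholders.

```python
# Rough sketch: encode the same clip with AMD's AMF, Nvidia's NVENC, and
# software x264 at the same bitrate, so the outputs can be compared directly.
# Assumes an ffmpeg build that exposes h264_amf and h264_nvenc; "input.mkv"
# is a placeholder for an existing recording.
import subprocess

ENCODERS = {
    "amd_amf":  "h264_amf",    # AMD hardware encoder (VCE/VCN)
    "nvidia":   "h264_nvenc",  # Nvidia hardware encoder (NVENC)
    "software": "libx264",     # CPU encoder for reference
}

def encode(clip: str, encoder: str, out: str, bitrate: str = "6M") -> bool:
    """Encode the clip with the given ffmpeg encoder at a fixed bitrate."""
    cmd = [
        "ffmpeg", "-y", "-hide_banner",
        "-i", clip,
        "-c:v", encoder,
        "-b:v", bitrate,
        "-an",               # drop audio; only video quality matters here
        out,
    ]
    return subprocess.run(cmd).returncode == 0

if __name__ == "__main__":
    for name, enc in ENCODERS.items():
        ok = encode("input.mkv", enc, f"test_{name}.mp4")
        print(f"{name}: {'ok' if ok else 'failed (encoder missing or unsupported)'}")
```

    Comparing the three outputs on fast-motion footage is usually enough to see the kind of difference being described here; if your ffmpeg build includes libvmaf, an objective score can be used instead.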

  • Quizzical Member Legendary, Posts: 25,483
    There are a lot of things that could be going on here.  Most of them can be summarized as, you can't beat an ASIC.

    An ASIC, or Application-Specific Integrated Circuit, is a chip designed to do exactly one very narrow thing, but to do it extremely well.  A video encoding ASIC can't just do "video encoding" in full generality.  It can do video encoding of this particular list of (possibly only one) encoding standards at this particular list of resolutions and frame rates.  Maybe they can put a little bit more flexibility into it by saying up to this resolution and up to this frame rate.  But as soon as you go outside of what an ASIC can do, it can't do what you need at all.

    So why build an ASIC if it's so restrictive?  Because you can't beat an ASIC.  Just about anything that an ASIC can do could be done on a CPU.  But the ASIC might be ten times as fast as the CPU.  Or a hundred times as fast.  The CPU has all sorts of logic that is there because some algorithms will need it, but the ASIC can discard everything not needed for its one special purpose.

    GPUs aren't ASICs and haven't been for about twenty years now.  But they can still have small portions of the chip for dedicated purposes.  For things that are used heavily enough with very specialized algorithms known in advance, CPUs and GPUs sometimes do include small portions of the chip to do that one particular thing.  GPUs have quite a few of these, from video decoding to hardware tessellation to primitive assembly.

    While video decoding is ubiquitous, video encoding is far less commonly used.  Thus, GPUs have had video decoding blocks for many years and the GPU vendors work very hard to make sure that they work right.  The first video encoding block in consumer hardware was introduced by Intel in their Sandy Bridge CPUs about nine years ago.  Since then, AMD and Nvidia have added their own video encoding blocks to their GPUs.

    But just because hardware has some specialized block for the algorithm you want to do doesn't mean that you have to use it.  It's possible that AMD's video encoding block is simply awful, and that's what you're seeing.  I'd regard that as unlikely.  It's more likely that it's not getting used properly, or perhaps even not at all.

    There are a variety of reasons why that could be the case.  One simple one is shenanigans where someone found a way to arbitrarily disable it.  It's also possible that the software you're using for streaming simply isn't written to use it properly.  AMD, Nvidia, and Intel all have very different hardware encoding blocks that would need to be called in different ways, and their capabilities can vary by generation of hardware, too.

    It's also possible that what you're doing simply goes outside the bounds of what AMD's hardware encoding block is built to do.  For example, if AMD designed their block to be able to handle 1920x1080 at up to 120 Hz, 2560x1440 at up to 60 Hz, or 3840x2160 at up to 30 Hz, and you want to encode 3840x2160 at 60 Hz, then the video encoding block just wouldn't be able to do it.  At that point, there are several options, all of which are bad.  It could return flagrantly broken output, such as a black screen.  It could forcibly reduce settings to something that the encoding block can handle--which could result in poor image quality.  It could crash to desktop.  Or it could just not use the video encoding block at all and drop back to a software implementation.
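    One rough way to check whether a hardware encoder simply refuses a given resolution and frame rate is to feed it a synthetic test pattern and see whether the encode succeeds. This is only a sketch, assuming an ffmpeg build that exposes the h264_amf encoder and working AMD drivers; the list of modes is arbitrary.

```python
# Sketch: probe a hardware encoder with ffmpeg's built-in test pattern at
# several resolutions and frame rates and report which ones it accepts.
# A rejection here suggests the request falls outside what the encoding block
# (or its driver exposure) supports, which is where software fallback or
# broken output tends to show up.
import subprocess

MODES = [
    ("1920x1080", 60),
    ("2560x1440", 60),
    ("3840x2160", 30),
    ("3840x2160", 60),
]

def probe(encoder: str, size: str, fps: int) -> bool:
    """Try a 2-second test encode; return True if the encoder accepted it."""
    cmd = [
        "ffmpeg", "-y", "-hide_banner", "-loglevel", "error",
        "-f", "lavfi", "-i", f"testsrc2=size={size}:rate={fps}:duration=2",
        "-c:v", encoder,
        "-f", "null", "-",   # discard the output; only success/failure matters
    ]
    return subprocess.run(cmd).returncode == 0

if __name__ == "__main__":
    for size, fps in MODES:
        ok = probe("h264_amf", size, fps)   # swap in h264_nvenc for Nvidia
        print(f"{size}@{fps}: {'accepted' if ok else 'rejected'}")
```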

    The problem with the software implementation is that neither the CPU nor the GPU has the horsepower to keep pace with an ASIC.  That's fine if you don't mind taking an hour to do a high quality encoding of a five minute video, but that doesn't work for game streaming.  You would likely have to do something much simpler, and that gives you poor image quality.  That's why GPUs have video encoding blocks in the first place, after all:  because a software implementation just can't keep up.

    You might ask, why would they restrict what the video encoding block can handle?  The answer is that it would take too much die space to handle proper video encoding for all of the monitors that can be plugged into the GPU simultaneously at their maximum possible frame rates and resolutions.  Given a choice between capping what the encoder can handle to something that not one customer in a thousand will notice or making all games run 10% slower across the board, the choice is obvious.

    You might ask, why would the CPU handle this better than the GPU?  Aren't CPUs supposed to be slower than GPUs?  But if it's a dedicated hardware block in the CPU or a dedicated hardware block in the GPU, the winner is the better hardware block.  The only advantage that being in the GPU offers over being in the CPU is that it takes less PCI Express bandwidth and system memory bandwidth if you're pulling back already compressed data from the GPU than if you have to pull back the raw frames for the CPU to compress them.  And even that probably doesn't matter very much.
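    To put rough numbers on that, here is a back-of-the-envelope calculation with assumed values (1080p60 video, NV12 frames, an 8 Mbit/s stream, PCIe 3.0 x16):

```python
# Back-of-the-envelope estimate of the bandwidth needed to pull frames off the GPU.
# Assumptions: 1920x1080 at 60 fps, NV12 frames (1.5 bytes per pixel, the usual
# input format for video encoders), an 8 Mbit/s compressed stream, PCIe 3.0 x16.
width, height, fps = 1920, 1080, 60

raw_bytes_per_sec  = width * height * 1.5 * fps   # uncompressed NV12 frames
compressed_per_sec = 8_000_000 / 8                # 8 Mbit/s stream in bytes/s
pcie3_x16_per_sec  = 15.75e9                      # ~15.75 GB/s in one direction

print(f"raw frames:  {raw_bytes_per_sec / 1e6:.0f} MB/s")     # ~187 MB/s
print(f"compressed:  {compressed_per_sec / 1e6:.1f} MB/s")    # ~1 MB/s
print(f"raw frames as a share of PCIe 3.0 x16: {raw_bytes_per_sec / pcie3_x16_per_sec:.1%}")
```

    Even uncompressed 1080p60 frames come to roughly one percent of PCIe 3.0 x16 bandwidth, which is why the transfer itself rarely matters.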

    It's possible that you've hit some corner case that Nvidia handles well and AMD doesn't.  There really aren't very many people who care about video encoding, and I'm not one of them, so I don't keep track of which vendors handle it better.  People who do end up in such corner cases sometimes have a good reason to have a strong preference for one particular GPU vendor.  That could either be because AMD's video encoder doesn't work very well, or because it works fine for most people but you're using some particular settings or software or something that doesn't play nicely with it.

    Unless you've compared what you're doing on an AMD GPU to an Nvidia GPU yourself, it's also possible that all you're seeing is some online rumors and shenanigans from people trying to make AMD look bad.  Hardware vendors try to make their competitors look bad all the time, and fanboys will sometimes do that, too.  If you have done that comparison yourself, and Nvidia works much better in your particular use case, then that's a compelling reason to go with Nvidia until that situation changes.
  • Vrika Member Legendary, Posts: 7,973
    Quizzical said:
    It's also possible that an updated video encoding block is simply one of the things that AMD decided to skip to save money. 
    This. Nvidia's GPUs really have better encoders than otherwise equivalent AMD GPUs.
     
  • MrMonolitas Member Uncommon, Posts: 263
    It is a shame. If they had encoders as good as Nvidia's, I would gladly take another AMD graphics card. Now I am kind of stuck with the green option. Hopefully AMD will have sorted it out by the time I upgrade. Thanks for the insights, guys. Not sure what other solution I have. People are suggesting I get a two-PC build, or an Nvidia graphics card... sigh
  • Ridelynn Member Epic, Posts: 7,383
    Get a bajillion core AMD CPU and use CPU encoding?
  • Asm0deus Member Epic, Posts: 4,599
    edited May 2020
    Naw, just stick to Nvidia cards for now if you do lots of streaming, and stick to using Nvidia GPU encoding. It does a really good job, and there's no need to mess around with a second PC and the issues that go with that.

    Even if you're using an AMD CPU with 6 or 8 cores, you will need those cores for playing the games and running OBS, PLUS all the extras like StreamElements, your scenes, etc.

    A two-PC build will cost more, your setup is more difficult, and you can run into latency issues and the like... there's not really any need anymore to use two PCs for streaming! It's more hassle than it's worth.

    Frankly it's really a non-issue, as you can simply use an AMD-based CPU build with an Nvidia GPU, or if you have a recent enough Ryzen, maybe 8 or 12 cores, you can try CPU encoding.

    See here:

    There's a valid reason the PC builds beyond lower-end budget builds all use Nvidia cards... it's not just Nvidia white-knighting or favoritism.

    It could also be a problem with your net connection or OBS settings if your stream looks that bad.

    Using my old-as-the-hills i5-750 @ 4GHz and an old GTX 760, my streams look good with these settings:

    Base canvas at 1080p because it's my current res, and I output 720p@60fps with the bicubic downscale filter.
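    For reference, a rough ffmpeg equivalent of that canvas/output setup (1080p source downscaled to 720p60 with a bicubic filter, software x264, placeholder file names and bitrates) might look like this sketch:

```python
# Sketch: a software (libx264) encode mirroring the OBS setup described above:
# a 1080p source scaled down to 1280x720 with a bicubic filter at 60 fps.
# "gameplay.mkv" is a placeholder for a local recording.
import subprocess

cmd = [
    "ffmpeg", "-y", "-hide_banner",
    "-i", "gameplay.mkv",
    "-vf", "scale=1280:720:flags=bicubic,fps=60",            # bicubic downscale to 720p60
    "-c:v", "libx264", "-preset", "veryfast", "-b:v", "4M",  # streaming-style settings
    "-c:a", "aac", "-b:a", "160k",
    "out_720p60.mp4",
]
subprocess.run(cmd, check=True)
```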


    My old clunker does struggle with scenes, though, and with brand new games (I have to turn down my settings), which is why I will be building a new rig at the end of the year, when hopefully some of the coronapocalypse effects have gone down a little.














  • MrMonolitas Member Uncommon, Posts: 263
    edited November 2020
    Asm0deus said:
    ...snip...
    But again, you are using Nvidia. These settings are not relevant to AMD. But I understand what you are trying to say. I really don't like the idea of two-PC streaming. Thanks for your input, it's helpful.

    I want to see what AMD's new graphics cards are capable of in streaming. Nobody talks about it!
  • Asm0deus Member Epic, Posts: 4,599
    MrMonolitas said:
    ...snip...

    Yeah, I dare hope AMD might get their version of NVENC running well, as it can make the difference between choosing an RTX 3xxx card and one of the new cards AMD has coming out.

    Going to wait on reviews, though; I think they should start popping up around the 18th.





