Hello, I have a question. Recently I dove back into fiddling with streaming and recording videos with my RX 580, but the quality of the videos was just not what I wanted, and it bothered me that nobody could explain to me why that is. I started looking into AMD's and Nvidia's encoders themselves and noticed that Nvidia's HEVC encoder is superior to AMD's encoders, which was a really big surprise to me. Why AMD has not been able to keep up with Nvidia on encoders boggles my mind. Apparently Nvidia's HEVC encoding is so good that it does not impact PC performance in any way and outputs very high quality video; there is nothing better than that.
So, my AMD experts, when do you expect AMD to tackle this problem? This feature is simply a must-have these days.
Comments
If the underlying issue is that it is using some encoding standard that isn't as good, it could be a simple problem of using a GPU that is too old. Your GPU's chip was released in mid-2016, which means it pretty much had to tape out by mid-2015. That means that in order to have hardware acceleration for a particular encoding standard (which is something that GPUs rely heavily upon), all the work had to be done by then.
While we're at it, what Nvidia GPU are you comparing it to? The contemporary Nvidia cards to yours are Pascal, or the GeForce 1000 series. If you're comparing it to Turing, those are much newer. As to the question of whether AMD will address it, the better question is whether they already have with Navi.
It's also possible that an updated video encoding block is simply one of the things that AMD decided to skip to save money. Recall that back when Polaris launched in 2016, AMD was in serious trouble and trying to save money anywhere they could to stave off bankruptcy. Ryzen saved the company, and now they're able to invest where they should.
I have seen a video with new comparisons. Let me...
There, I found it. You can clearly see the tests.
I even stopped using the GPU encoder and switched to the CPU; the quality is better.
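For anyone who wants to reproduce that kind of CPU-vs-GPU comparison, here is a rough sketch. It assumes ffmpeg is on your PATH and that your build includes the libx264 (CPU) and h264_amf (AMD hardware) encoders; the input file name and bitrate are just placeholders.

```python
import subprocess

# Hypothetical input clip and target bitrate -- adjust to taste.
SOURCE = "gameplay.mkv"
BITRATE = "6M"

def encode(encoder: str, output: str) -> None:
    """Encode SOURCE with the given ffmpeg encoder at a fixed bitrate."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", encoder, "-b:v", BITRATE,
         "-an",          # drop audio so only video quality differs
         output],
        check=True,
    )

# CPU (software) encode vs. AMD hardware encode of the same clip,
# at the same bitrate, so the files can be compared side by side.
encode("libx264", "out_cpu.mp4")
encode("h264_amf", "out_amd.mp4")
```

Comparing both outputs at the same bitrate is the fair test; at typical streaming bitrates the software encoder usually produces the better-looking file, which is consistent with what you're seeing.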
An ASIC, or Application-Specific Integrated Circuit, is a chip designed to do exactly one very narrow thing, but to do it extremely well. An ASIC can't just do "video encoding" in full generality. It can do video encoding for this particular list of (possibly only one) encoding standards at this particular list of resolutions and frame rates. Maybe they can build in a little more flexibility by saying up to this resolution and up to this frame rate. But as soon as you go outside of what an ASIC can do, it can't do what you need at all.
So why build an ASIC if it's so restrictive? Because you can't beat an ASIC. Just about anything that an ASIC can do could be done on a CPU. But the ASIC might be ten times as fast as the CPU. Or a hundred times as fast. The CPU has all sorts of logic that is there because some algorithms will need it, but the ASIC can discard everything not needed for its one special purpose.
GPUs aren't ASICs and haven't been for about twenty years now. But they can still have small portions of the chip for dedicated purposes. For things that are used heavily enough with very specialized algorithms known in advance, CPUs and GPUs sometimes do include small portions of the chip to do that one particular thing. GPUs have quite a few of these, from video decoding to hardware tessellation to primitive assembly.
While video decoding is ubiquitous, video encoding is far less commonly used. Thus, GPUs have had video decoding blocks for many years and the GPU vendors work very hard to make sure that they work right. The first video encoding block in consumer hardware was introduced by Intel in their Sandy Bridge CPUs about nine years ago. Since then, AMD and Nvidia have added their own video encoding blocks to their GPUs.
But just because hardware has some specialized block for the algorithm you want to do doesn't mean that you have to use it. It's possible that AMD's video encoding block is simply awful, and that's what you're seeing. I'd regard that as unlikely. It's more likely that it's not getting used properly, or perhaps even not at all.
There are a variety of reasons why that could be the case. One simple one is shenanigans where someone found a way to arbitrarily disable it. It's also possible that the software you're using for streaming simply isn't written to use it properly. AMD, Nvidia, and Intel all have very different hardware encoding blocks that would need to be called in different ways, and their capabilities can vary by generation of hardware, too.
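To make that last point concrete, here is a minimal sketch of the kind of branching the streaming software has to do. It borrows ffmpeg's encoder names (h264_amf for AMD, h264_nvenc for Nvidia, h264_qsv for Intel Quick Sync); the vendor string and file names are just placeholders, and a real application would query the driver for much more than a name.

```python
import shutil
import subprocess

# ffmpeg's names for the vendor-specific H.264 hardware encoders.
HW_ENCODERS = {
    "amd": "h264_amf",
    "nvidia": "h264_nvenc",
    "intel": "h264_qsv",
}

def pick_encoder(vendor: str) -> str:
    """Pick a hardware encoder for the detected GPU vendor, or fall back to CPU."""
    encoder = HW_ENCODERS.get(vendor.lower())
    if encoder is None:
        return "libx264"  # software fallback when no hardware block is usable
    # A real app would also check which codecs, resolutions, and frame rates
    # this generation of hardware supports; this sketch only does the name lookup.
    return encoder

def encode(vendor: str, source: str, output: str) -> None:
    if shutil.which("ffmpeg") is None:
        raise RuntimeError("ffmpeg not found on PATH")
    subprocess.run(
        ["ffmpeg", "-y", "-i", source,
         "-c:v", pick_encoder(vendor), "-b:v", "6M", output],
        check=True,
    )

encode("amd", "gameplay.mkv", "stream_test.mp4")
```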
It's also possible that what you're doing simply goes outside the bounds of what AMD's hardware encoding block is built to do. For example, if AMD designed their block to be able to handle 1920x1080 at up to 120 Hz, 2560x1440 at up to 60 Hz, or 3840x2160 at up to 30 Hz, and you want to encode 3840x2160 at 60 Hz, then the video encoding block just wouldn't be able to do it. At that point, there are several options, all of which are bad. It could return flagrantly broken output, such as a black screen. It could forcibly reduce settings to something that the encoding block can handle--which could result in poor image quality. It could crash to desktop. Or it could just not use the video encoding block at all and drop back to a software implementation.
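As a sketch of that decision, using the made-up limits from the example above (they are not AMD's real numbers), the driver or the streaming app effectively has to do something like this:

```python
# Hypothetical capability table: (width, height) -> maximum frame rate the
# hardware encoder can sustain. These numbers are invented for the example
# above, not AMD's actual limits.
ENCODER_LIMITS = {
    (1920, 1080): 120,
    (2560, 1440): 60,
    (3840, 2160): 30,
}

def choose_path(width: int, height: int, fps: int) -> str:
    """Return 'hardware' if the request fits the block's limits, else 'software'."""
    max_fps = ENCODER_LIMITS.get((width, height))
    if max_fps is not None and fps <= max_fps:
        return "hardware"
    # Outside the block's limits: fall back to a much slower software encoder,
    # reduce the settings, or refuse the request entirely.
    return "software"

print(choose_path(3840, 2160, 30))  # hardware
print(choose_path(3840, 2160, 60))  # software fallback
```

Whether that fallback is graceful or ugly depends entirely on the driver and the application.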
The problem with the software implementation is that neither the CPU nor the GPU has the horsepower to keep pace with an ASIC. That's fine if you don't mind taking an hour to do a high quality encoding of a five minute video, but that doesn't work for game streaming. You would likely have to do something much simpler, and that gives you poor image quality. That's why GPUs have video encoding blocks in the first place, after all: because a software implementation just can't keep up.
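To put a rough number on "can't keep up": the pixel throughput a real-time encoder has to sustain grows very quickly with resolution and frame rate.

```python
# Pixels per second a real-time encoder must process at common settings.
for width, height, fps in [(1920, 1080, 60), (2560, 1440, 60), (3840, 2160, 60)]:
    pixels_per_second = width * height * fps
    print(f"{width}x{height} @ {fps} Hz: {pixels_per_second / 1e6:.0f} million pixels/s")

# 1920x1080 @ 60 Hz: 124 million pixels/s
# 2560x1440 @ 60 Hz: 221 million pixels/s
# 3840x2160 @ 60 Hz: 498 million pixels/s
```

A good software encoder spends a lot of work per pixel on motion search and rate control, which is why real-time CPU encoding pushes you toward the fastest, lowest-quality presets.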
You might ask, why would they restrict what the video encoding block can handle? The answer is that it would take too much die space to handle proper video encoding for all of the monitors that can be plugged into the GPU simultaneously at their maximum possible frame rates and resolutions. Given a choice between capping what the encoder can handle to something that not one customer in a thousand will notice or making all games run 10% slower across the board, the choice is obvious.
You might ask, why would the CPU handle this better than the GPU? Aren't CPUs supposed to be slower than GPUs? But if it's a dedicated hardware block in the CPU or a dedicated hardware block in the GPU, the winner is the better hardware block. The only advantage that being in the GPU offers over being in the CPU is that it takes less PCI Express bandwidth and system memory bandwidth if you're pulling back already compressed data from the GPU than if you have to pull back the raw frames for the CPU to compress them. And even that probably doesn't matter very much.
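That bandwidth point is easy to put numbers on. A rough calculation, assuming uncompressed 8-bit RGBA frames and a typical ~6 Mbps stream:

```python
# Rough bandwidth comparison: raw frames pulled back to the CPU
# versus an already-compressed stream pulled back from the GPU.
width, height, fps = 1920, 1080, 60
bytes_per_pixel = 4                      # 8-bit RGBA

raw_bytes_per_second = width * height * bytes_per_pixel * fps
stream_bits_per_second = 6_000_000       # a typical ~6 Mbps stream

print(f"raw frames:        {raw_bytes_per_second / 1e6:.0f} MB/s")        # ~498 MB/s
print(f"compressed stream: {stream_bits_per_second / 8 / 1e6:.2f} MB/s")  # ~0.75 MB/s
```

Roughly half a gigabyte per second sounds like a lot, but a PCI Express 3.0 x16 link is good for around 16 GB/s, so even the raw-frame path usually isn't the bottleneck.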
It's possible that you've hit some corner case that Nvidia handles well and AMD doesn't. There really aren't very many people who care about video encoding, and I'm not one of them, so I don't keep track of which vendors handle it better. People who do end up in such corner cases sometimes have a good reason to have a strong preference for one particular GPU vendor. That could either be because AMD's video encoder doesn't work very well, or because it works fine for most people but you're using some particular settings or software or something that doesn't play nicely with it.
Unless you've compared what you're doing on an AMD GPU to an Nvidia GPU yourself, it's also possible that all you're seeing is some online rumors and shenanigans from people trying to make AMD look bad. Hardware vendors try to make their competitors look bad all the time, and fanboys will sometimes do that, too. If you have done that comparison yourself, and Nvidia works much better in your particular use case, then that's a compelling reason to go with Nvidia until that situation changes.
I want to see what AMD's new graphics cards are capable of in streaming. Nobody talks about it!