Review: AMD Vega 56 8GB Graphics Card (PowerColor Red Devil Vega 56 8GB)

It has been a rocky road with Vega: launched late and, when it did launch, in low availability. AMD has been trying to stay on top of the launch and sales, which really placed its board partners in a second-row position. We've stated it from the beginning: Vega isn't bad at all. Sure, it might need a few more watts and proper cooling, but it is exactly that last segment where custom-design AIB partner cards can really make a difference.

One of these partners is TUL, better known as PowerColor. AMD seems to have backed away from forced reference-design sales and is now delivering the actual GPUs to its partners to do their thing. We received their Red Devil Radeon RX Vega 56 for review, and in our opinion the Vega 56 is the better deal, value-for-money wise. This card is maybe 10% slower than the Vega 64, yet it is cheaper and will still get you into that Quad HD resolution gaming performance bracket. Depending on the game and workload, its performance sits in between the GeForce GTX 1070 and 1080, which is a mighty good spot to be in at this price. There's more to it than just that, though: the Vega 56 also shows a lower TDP and lower temperatures.

The numbers in the Vega suffixes reflect the actual shader cluster (compute unit) counts: one (56) with 3584 and the other (64) with 4096 stream processors, at 64 stream processors per compute unit. In this review we take a look at the PowerColor Radeon RX Vega 56 8GB, in particular the Red Devil edition. The GPU has 3584 stream processors, boosts up to 1526 MHz, and has been fitted with 8GB of HBM2 graphics memory. The PowerColor Red Devil Vega 56 comes with not one, not even two, but three BIOSes, and you can address all three modes:

  • Silent mode: Mute Fan technology ensures quiet operation.
  • Standard mode: pursues a balanced gaming experience, with low temperatures and good OC capability.
  • OC mode: for gamers to push the limits of the Red Devil.
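The suffix arithmetic mentioned above works out neatly; here is a quick sketch, assuming 64 stream processors per compute unit as described:

```python
# Each Vega compute unit (CU) contains 64 stream processors (SPs),
# so the product-name suffix maps directly to the shader count.
SP_PER_CU = 64

for cu_count in (56, 64):
    total_sps = cu_count * SP_PER_CU
    print(f"Vega {cu_count}: {cu_count} CUs x {SP_PER_CU} SPs = {total_sps} stream processors")
# Vega 56: 56 CUs x 64 SPs = 3584 stream processors
# Vega 64: 64 CUs x 64 SPs = 4096 stream processors
```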

It is a product with a massive three-slot, triple-fan cooler design. The card is fitted with two 8-pin power connectors feeding its 12 power phases.

The big challenge for the RX Vega 56 is to push past the similarly priced Nvidia competition. Just as the RX Vega 64 is priced to go head-to-head with the GTX 1080, the RX Vega 56 was released into the wild with the GTX 1070 in its sights. Unfortunately, Nvidia hit back with the similarly unavailable GTX 1070 Ti.

As with the RX Vega 64's attempt to overthrow the GTX 1080, it's rather difficult to call an outright winner. As we said about the flagship Vega GPU, it's an architecture that seems to have been designed primarily for a gaming future that has yet to arrive. As such, it's rather lacklustre in its legacy gaming performance.


In games built on the last-gen DirectX 11 API, the second-tier Vega is left trailing in the wake of the GTX 1070, let alone the GTX 1070 Ti. But once you bring in tests based around the newer DirectX 12 or Vulkan APIs, Vega's modern architectural design allows it to take the lead.

Battlefield 4 tested at 1080p resolution.

Unfortunately, despite DX11 now being very much a legacy API, and DX12 being over a year old, most PC games are still being released on the older system. That means that for many of the PC games coming out over the next six months, at least, the GTX 1070 is likely to retain its performance advantage.

Battlefield 4 running at 4K resolution.

Granted, Vega's performance improvements in DX12 and Vulkan are encouraging for how it'll fare down the line as they become the dominant APIs, but right now anyone lucky enough to find an RX Vega 56 at a decent price is still paying the same money for a card that struggles to beat a smaller, more efficient, year-old GPU in pretty much every game in their Steam library.

That's not the only competition for the RX Vega 56, however, as there's also the small matter of fratricide. Despite the US$100 difference in their respective SEPs (suggested e-tail prices), AMD hasn't, as is its wont, made sweeping changes to the core configuration of the two Vega variants. Essentially, the RX Vega 56 simply operates with 12.5% fewer cores, yet in performance terms it is only ever around 7-10% slower.
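The core-count gap above is simple to verify; a back-of-the-envelope check (the 7-10% figure is our observed benchmark range, not a computed value):

```python
# Stream processor counts of the two Vega variants.
vega64_sps = 4096
vega56_sps = 3584

# Relative core deficit of the RX Vega 56 versus the RX Vega 64.
core_deficit = 1 - vega56_sps / vega64_sps
print(f"RX Vega 56 core deficit: {core_deficit:.1%}")
# RX Vega 56 core deficit: 12.5%

# The observed performance gap is only around 7-10%, i.e. gaming
# performance does not scale linearly with shader count alone.
```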

That means there's a good chance most gamers will end up going for this mildly cut-down Vega instead of the much more expensive card. Or there would be, if the cards were consistently available at reasonable prices. We've seen retailers list RX Vega 64 SKUs for around the same price as, sometimes less than, that shop's cheapest available RX Vega 56.

Yeah, Vega’s weird.


The overall design of the Vega architecture seems to have been about laying down a marker, defining a branching point for future generations of AMD's GPU technology. It has almost sacrificed legacy gaming performance for the promise of future applications of its feature set. The little extras baked into the Vega architecture look like they could be genuinely game-changing…but only if developers end up taking advantage of them. And that's a big if.

If it were guaranteed that the HBCC and Rapid Packed Math shenanigans were going to be employed across the board, and not just by AMD best-buds Bethesda, then the new Radeon tech would be the one to go for over the old-school Nvidia design. But it's not a sure thing, and we don't really know what performance improvements these Vega features might offer, either.

Battlefield 4 has shown some impressive Vega performance in the face of the GTX 1080 competition, alongside AMD's traditional speed wins in low-level APIs such as Vulkan. Vega is also showing greater gains than the Polaris architecture did, which would indicate there is something to the Rapid Packed Math feature. We're going to be doing more investigating on that in the future.

If you were spending around US$400 on a graphics card today – even if there were RX Vega 56s available at their suggested price – it would still be difficult to make the Radeon recommendation. It's the more advanced architecture, but in raw performance terms the smaller, slightly cheaper, more efficient Nvidia GPU is likely to get you higher frame rates in more of the games you're playing now.

If AMD could have priced Vega more aggressively against the still-strong, two-year-old Nvidia Pascal architecture, it would have a better chance of taking the market by storm, but unfortunately the price of HBM2 is a big sticking point. And with AMD reportedly losing US$100 on each card sold at the suggested price, it looks like it has done all it can on that front.

We do like what AMD is trying to do with Vega, but, all things considered, sacrificing current performance for the chance of higher frame rates in a few future games is likely to be too much of a stretch for most PC gamers.

Dominic Chua

Boys just want to play games. And have fun.