Radeon RX Vega Unveiled: AMD Announces $499 RX Vega 64 & $399 RX Vega 56, Launching August 14th
by Ryan Smith on July 30, 2017 10:30 PM EST
At this point, one must give credit to AMD for their marketing program for the Radeon RX Vega. The company has opted to drip-feed information over many months, and as a result has kept the public interested in the architecture and the consumer RX Vega cards. Since it was first revealed by name back in the spring of 2016, we’ve had architecture previews, product teasers, and even a new Frontier Edition to tide us over. Suffice it to say, there’s a great deal of fascination in finally seeing the products AMD has been beating the drums about for so long.
To that end, there’s good news today and there’s bad news today. In the interest of expediency, I may as well start with the bad news: today is not the launch day for the Radeon RX Vega. In fact, only right before this embargo expired did AMD even announce a launch date: August 14th. So for reviews, performance analyses, and of course purchasing, everyone will have to hold on just a bit longer.
The good news then is that even if today isn’t the Radeon RX Vega launch, AMD is finally making significant progress towards it by announcing the cards, the specifications, and the pricing. Gamers may not be able to buy the cards quite yet, but everyone is going to have some time to size up the situation before the proper launch of the cards next month. Overall this situation is very similar to the unveiling of the Radeon R9 290 series, where AMD announced the cards at a product showcase before launching them the following month.
So without further ado, let’s dive into the Radeon RX Vega family of cards and their specifications.
**AMD Radeon RX Series Specification Comparison**

| | AMD Radeon RX Vega 64 Liquid | AMD Radeon RX Vega 64 | AMD Radeon RX Vega 56 | AMD Radeon R9 Fury X |
|---|---|---|---|---|
| Memory Clock | 1.89Gbps HBM2 | 1.89Gbps HBM2 | 1.6Gbps HBM2 | 1Gbps HBM |
| Memory Bus Width | 2048-bit | 2048-bit | 2048-bit | 4096-bit |
| Manufacturing Process | GloFo 14nm | GloFo 14nm | GloFo 14nm | TSMC 28nm |
| Architecture | GCN 5 | GCN 5 | GCN 5 | GCN 3 |
| GPU | Vega 10 | Vega 10 | Vega 10 | Fiji |
All told, AMD will be releasing 3 different RX Vega cards. All 3 cards are based on the same GPU, Vega 10, which powers the already released Radeon Vega Frontier Edition. So if you’re familiar with that card, then you should have an idea of what to expect here.
The top of AMD’s lineup is the Radeon RX Vega 64 Liquid Cooled Edition. This is a fully enabled Vega 10 card, and it has the highest clockspeeds and highest power requirements of the stack. All told, this is 64 CUs, 64 ROPs, boosting to 1677MHz, and paired with 8GB of HBM2 memory clocked at 1.89Gbps. Typical board power for the card is rated at 345W. To cool such a card, you of course will want liquid cooling, and living up to the card’s name, AMD has included just that, thanks to a pump and a 120mm radiator.
The second member of AMD’s lineup is the shorter-named vanilla Radeon RX Vega 64. Unlike its liquid-cooled sibling, this is a traditional blower-type air-cooled card. And for the purposes of AMD’s product stack, the company is treating the vanilla Vega 64 as the “baseline” card for the Vega 64 family. This means that the company’s performance projections are based on this card, and not the higher-clocked liquid cooled card.
The vanilla Vega 64 utilizes the same fully enabled Vega 10 GPU, with 64 CUs and 64 ROPs. The card’s reduced cooling capacity goes hand-in-hand with slightly lower clockspeeds of 1247MHz base and 1546MHz boost. Paired up with the Vega GPU itself is the same 8GB of HBM2 as on the liquid cooled card, still running at 1.89Gbps for 484GB/sec of memory bandwidth. Finally, this card ships with a notably lower TBP than the liquid cooled card, bringing it down by 50W to 295W.
Meanwhile, unlike any of the other cards in the RX Vega family, the Vega 64 will come in two shroud design options. AMD’s reference shroud is a plastic/rubber design similar to what we saw on the reference Radeon RX 480 launched last year. AMD will also have a “limited edition” version of the card with the same hardware specifications, but replacing the rubber shroud with a brushed aluminum shroud, very similar to the one found on the Vega Frontier Edition. It’s important to note, though, that the only difference between these two cards is the material of the shroud; the cards are otherwise identical: PCBs, performance, cooling systems, and all.
On that note, AMD has only released a limited amount of information on the cooler design of the Vega 64, which is of particular interest as it’s an area where AMD struggled on the R9 290 and RX 480 series. We do know that the radial fan is larger, now measuring 30mm in radius (60mm in diameter). The fan in turn is responsible for cooling a heatsink that’s attached to the Vega 10 GPU + memory package via a vapor chamber, a typical design choice for high performance, high TDP video cards.
Finally, the last member of the RX Vega family is the Radeon RX Vega 56. The obligatory cut-down member of the group, this card gets a partially disabled version of the Vega 10 GPU with only 56 of 64 CUs enabled. On the clockspeed front, this card also sees reduced GPU and memory clockspeeds; the GPU runs at 1156MHz base and 1471MHz boost, while the HBM2 memory runs at 1.6Gbps (for 410GB/sec of memory bandwidth). Following the traditional cut-down card model, this lower performing card is also lower power – and quite possibly the most power efficient RX Vega card – with a 210W TBP, some 85W below the Vega 64. Meanwhile, other than its clockspeed the card’s HBM2 memory is untouched, shipping with the same 8GB of memory as the other RX Vega members.
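As a back-of-the-envelope check, the memory bandwidth figures quoted above follow directly from bus width and per-pin data rate. A minimal sketch (the function name is just for illustration):

```python
def hbm_bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Total memory bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# RX Vega 64 (liquid and air): 2048-bit bus at 1.89Gbps -> ~484 GB/s
print(round(hbm_bandwidth_gbps(2048, 1.89), 2))  # 483.84
# RX Vega 56: 2048-bit bus at 1.6Gbps -> ~410 GB/s
print(round(hbm_bandwidth_gbps(2048, 1.6), 2))   # 409.6
# R9 Fury X: 4096-bit HBM1 at 1Gbps -> 512 GB/s
print(round(hbm_bandwidth_gbps(4096, 1.0), 2))   # 512.0
```

This also shows why first-generation HBM on Fiji could still win on raw bandwidth: its bus is twice as wide, offsetting HBM2's higher per-pin data rate.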
Moving on, perhaps the burning question for many readers now that they have the specifications in hand is expected performance, and this is something of a murky area. AMD has published some performance slides for the Vega 64, but they haven’t taken the time to extensively catalog what they see as the competition for the card and where the RX Vega family fits into that. Instead, what we’ve been told is to expect the Vega 64 to “trade blows” with NVIDIA’s GeForce GTX 1080.
In terms of numbers, the few figures the company has published have focused on minimum framerates over average framerates, opting to emphasize smoothness and the advantage they believe they have over the aforementioned GTX 1080. As always, competitive numbers should be taken with a (large) grain of salt, but for the time being this is the best guidance we have on what to expect for the RX Vega family’s performance.
Otherwise for the Vega 64 Liquid and Vega 56, we don’t have any other performance figures. Expect the former to outperform the air cooled Vega 64 – though perhaps not massively – while the Vega 56 will come in notably lower.
Sttm - Monday, July 31, 2017
295W to match the 180W GTX 1080... THEY ARE A GENERATION BEHIND!
tuxRoller - Monday, July 31, 2017
Wow.
This is just awful.
Something must've gone really wrong with either the design or implementation, or both.
There's no good excuse why a card that is running on a node two generations newer, AND that has a similar power draw, should ever deliver fewer fps.
I really hope the story of wtf happened to Vega emerges.
abufrejoval - Monday, July 31, 2017
The R9 Nano pointed in the right direction for what HBM could do: Less power, noise and size, while beating the R9 290X on performance.
For the VEGA I expected a generational improvement like I got from the GTX 1070 vs. the GTX 980ti: Same performance at half price and power.
345 Watts for the equivalent of a 180 Watt/(R9 Nano level) GTX 1080? Seriously? Did someone swallow a shrink and make HBM2 more energy hungry than GDDR5X?
Even the Infinity Fabric cannot rescue such a Watt/performance blunder...
CiccioB - Tuesday, August 1, 2017
BTW, Infinity Fabric is not a magic component, and for sure not an energy-saving one. On the contrary, it is an added component that needs extra energy to solve scaling problems.
So, in the end, a possible MCM GPU will have a lower performance/energy ratio than a monolithic one.
7beauties - Monday, July 31, 2017
I'm pleased as a bowl of rum punch that Dr. Lisa Su is aggressively competing against Intel with Ryzen and now against Nvidia with Vega, but to be an AMD fanboy is to suffer wait times of foot-long beards and inches on the waistline. AMD is typically late in unveiling new hardware. I hope that their cadence of tic-toc-toc will at least bring noteworthy enhancements. Good luck AMD. It's great to see you spoil Intel's party, as I hope you do with Nvidia's, but it's been a long, long time coming.
spat55 - Tuesday, August 1, 2017
The biggest issue with them being this late will be nVidia with Volta; even if Vega is decent and slots in between a 1080 & 1080ti, it'll soon be beaten by Volta. Those of us who wanted to upgrade bought a 1070/1080/1080ti and will be waiting for Volta.
CiccioB - Tuesday, August 1, 2017
What many AMD fanboys have still not understood is that you do not compete with anyone if your product costs more (to produce) and is sold for less. Your product is simply underpriced to have some appeal, and THAT IS NOT COMPETITION!
Vega, like the entire GCN architecture, is simply underperforming and needs its clocks boosted well outside their optimal energy-efficiency point, while still not reaching decent performance against the lower-tier competing solution.
This architecture has to be scrapped and a new one must take its place as soon as possible, or will we wait for the eternal savior? (It was Tahiti at the beginning, which soon showed it was too big and too power hungry, then it was Fiji, then Pascal, then Vega... next is Navi...) Will we ever see an architecture performing better than the nvidia ones without using tons more silicon and watts, so being really competitive?
Outlander_04 - Tuesday, August 1, 2017
You must be one of the very few who have actually used a Vega card? No?
So its almost like you are just making stuff up?
CiccioB - Thursday, August 3, 2017
Hahahahah.
With 490mm², HBM2 and 300W it should leave GP102 in the dust, not trade blows with its lower-tier cousin, the GP104, released 14 months ago.
Are you kidding when you talk about how good GCN is? Have you still not understood how bad it is? What do you need to understand it? Well, possibly Volta will teach you how good GCN is, when the x80 series sells for $600.
And, yes, if you expressly code for GCN (like DICE did) it will gain some points. As with any other architecture. Which does not cancel out the poor area*power/performance ratio GCN has.
fanofanand - Wednesday, August 2, 2017
Pascal was Nvidia, just an FYI. Otherwise nice rant.