The NVIDIA GeForce GTX 1080 Preview: A Look at What's to Come
by Ryan Smith on May 17, 2016 9:00 AM EST

First Thoughts
Wrapping up our preview of the GeForce GTX 1080, I think it’s safe to say that NVIDIA intends to start off the 16nm/14nm generation with a bang. As the first high-end card of this generation, the GTX 1080 sets new marks for overall performance and for power efficiency, thanks to the combination of TSMC’s 16nm FinFET process and NVIDIA’s Pascal architecture. Translating this into numbers, at 4K we’re looking at a 30% performance gain versus the GTX 980 Ti and a 70% performance gain over the GTX 980, amounting to a very significant jump in efficiency and performance over the Maxwell generation.
Looking at the bigger picture, as the first vendor to launch their 16nm/14nm flagship card, NVIDIA will get to enjoy the first mover’s advantage, both with respect to setting performance expectations and with pricing. The GeForce GTX 1080 will keep the performance crown solidly in NVIDIA’s hands, and with it control of the high-end video card market for some time to come. NVIDIA’s loyal opposition, AMD’s Radeon Technologies Group, has strongly hinted that they’re not going to be releasing comparable high-performance video cards in the near future. Rather, the company is looking to make a run at the much larger mainstream market for desktops and laptops with their Polaris architecture, something that GP104 isn’t meant to address.
The lack of competition at the high-end means that for the time being NVIDIA can price the GTX 1080 at what the market will bear, and this is more or less what we’re looking at for NVIDIA’s new card. While the formal MSRP on the GTX 1080 is $599 – $50 over what the GTX 980 launched at – that price is the starting price for custom cards from NVIDIA’s partners. The reference card as we’ve previewed it today – what NVIDIA is calling the Founders Edition card – carries a $100 premium over that, pushing it to $699.
GeForce GTX 1080 Configurations

| | Base | Founders Edition |
| Core Clock | 1607MHz | 1607MHz |
| Boost Clock | 1733MHz | 1733MHz |
| Memory Clock | 10Gbps GDDR5X | 10Gbps GDDR5X |
| Cooler | Manufacturer Custom (Typical: 2 or 3 Fan Open Air) | NVIDIA Reference (Blower w/Vapor Chamber) |
| Availability Date | June 2016? | 05/27/2016 |
| Price | Starting at $599 | $699 |
While the differences between the reference and custom cards will be a longer subject for our full review, the more immediate ramification is going to be that only the Founders Edition cards are guaranteed to be available at launch. NVIDIA can’t speak definitively for their board partners, but at this point I am not seriously expecting custom cards until June. And this means that if you want one of the first GTX 1080s, then you’re going to have to pay $699 for the Founders Edition card. Which is not to say that it’s a bad card – far from it, it’s probably NVIDIA’s finest reference card to date – however it pushes the card’s price north of 980 Ti territory, some $150 higher than where the GTX 980 launched in 2014. For those who can afford such a card they will not be disappointed, but it’s definitely less affordable than past NVIDIA x80 cards.
Anyhow, we’ll be back later this week with our full review of the GeForce GTX 1080, so be sure to stay tuned.
Spring 2016 GPU Pricing Comparison

| AMD | Price | NVIDIA |
| | $699 | GeForce GTX 1080 FE |
| Radeon R9 Fury X | $609 | |
| | $589 | GeForce GTX 980 Ti |
| | $429 | GeForce GTX 980 |
| Radeon R9 390X | $399 | |
| Radeon R9 390 | $289 | GeForce GTX 970 |
262 Comments
QinX - Tuesday, May 17, 2016 - link
Thanks for the explanation, I was worried that support for older games was already going down.

Badelhas - Tuesday, May 17, 2016 - link
What about including the HTC Vive in your benchmarks? If you talk about the VR benefits, you have to show them in graphs, it's your speciality, AnandTech! ;)

JeffFlanagan - Tuesday, May 17, 2016 - link
Seconded. At this point VR gaming is much more interesting to me than even 4K gaming, and will drive my video card upgrades from now on. It's really nice to be able to play a game like it's the real world, rather than using a controller and looking at a screen.

MFK - Tuesday, May 17, 2016 - link
Completely agreed. I'm a casual gamer, and my i5-2500k + GTX 760 serve me perfectly fine.
I have a 1440p monitor, but I reduce the resolution to 1080p or 720p depending on how demanding the game is.
My upgrade will be determined and driven by VR. Whoever manages to deliver acceptable VR performance at a reasonable price will get my $.
And they will be competing in price and content against the PS4k + Move + Morpheus combo.
Ryan Smith - Tuesday, May 17, 2016 - link
It's in the works, though there's an issue with how many games can be properly tested in VR mode without a headset attached.

haplo602 - Tuesday, May 17, 2016 - link
It will be interesting to see how much GDDR5X affects the scores vs GDDR5. 1080 vs 1070 will be very telling, or in the alternative a downclocked 1080 vs a 980 Ti...

fanofanand - Tuesday, May 17, 2016 - link
Excellent preview, little typo here: "Translating this into numbers, at 4K we’re looking at 30% performance gain versus the GTX 980 and a 70% performance gain over the GTX 980, amounting to a very significant jump in efficiency and performance over the Maxwell generation." That durn GTX 980 is just all over the board!
tipoo - Tuesday, May 17, 2016 - link
How does Pascal do on async compute? I know that was the big bugbear with Maxwell, with Nvidia promising it but it looking like they were doing the scheduling on the CPU, not on the GPU like GCN.

http://www.extremetech.com/extreme/213519-asynchro...
https://forum.beyond3d.com/threads/dx12-performanc...
Stuka87 - Tuesday, May 17, 2016 - link
I do find it a bit annoying that you guys are still using a junk reference 290X instead of a properly cooled 390X.

TheinsanegamerN - Tuesday, May 17, 2016 - link
That's what AMD provided. A custom cooled Nvidia 980 Ti will perform better than the stock model, yet people don't complain about that. When Anand DID use a third party card (460s IIRC) there was a massive backlash from the community saying they were "unfair" in their reviews. So now they just use stock cards. Blame AMD for dropping the ball on that one.