
Nvidia's RTX 2080 Ti, RTX 2080 And GTX 1080 Ti Compared



Ray tracing is all well and good, but how many frames do you actually get from Nvidia's new RTX 2080 Ti and RTX 2080 cards?

After the company's big launch at Gamescom this year, I have been slowly working my way through testing what most people would consider the three flagship gaming cards: founders editions of the RTX 2080 Ti and RTX 2080, which cost $1899 and $1299 respectively. I've been running those against a GTX 1080 Ti, the previous flagship gaming card, which you can grab locally for between $1150 and $1250 depending on the brand.



But when the cards launched, there were two parts missing: games that supported ray tracing, and the all-too-crucial Windows update enabling ray tracing within DirectX 12. That Windows update turned out to be a horror show for Microsoft, with the company having to roll back the update not once, but twice, over system stability issues.

So when you take ray tracing out of the equation – which I will do for this article – how do the RTX cards stack up in modern games? Nvidia released some preliminary graphs and slides showing that the RTX 2080 could handle games like HITMAN and Final Fantasy XV at (or above) 60fps at 4K with HDR.



But in the real world, with real world drivers, released games and publicly available synthetic tests, how far does your $1,300 or $1,900 go?

Before we get into the tests, here is the system used. It's what most people would consider a good gaming rig, but it's not the best. The main weakness is the 7900X CPU, a 10-core Intel offering. It plays games fine, but it won't get the same results as a six or eight-core chip that can run at a higher clock speed (like the recently released i9-9900K, or the popular i7-8700K, which has fewer physical cores but a higher turbo clock than the other chips). Keep that in mind as you digest the results below.

Benchmark System:

  • CPU: Intel Core i9-7900X (stock speeds)
  • RAM: 32GB DDR4 3200MHz G-Skill TridentZ RGB RAM
  • GPUs: GTX 1080 Ti Founders Edition / RTX 2080 / RTX 2080 Ti Founders Edition
  • Motherboard: Gigabyte AORUS Gaming 7
  • Monitors: Acer X27 4K HDR 144Hz / EIZO 23.5" 240Hz VA monitor
  • PSU: EVGA Supernova G2 850W
  • GPU Drivers: 416.16 (October 4 2018)

Many thanks to Nvidia for supplying the Acer X27 Predator screen for this testing.

For clarity: the 7900X is running at stock clocks on a Corsair H100i liquid cooler, while the RAM is running at 14-14-14-34 timings at 1.35V (confirmed with CPU-Z). G-SYNC was disabled for all tests, and the GPU was set to Maximum Performance in the Nvidia Control Panel.

The tests and games used were:

  • 3D Mark (Fire Strike, Fire Strike Ultra, Fire Strike Extreme)
  • Forza Horizon 4 (DX12)
  • Total War: Warhammer 2 (DX11)
  • Shadow of the Tomb Raider (DX12)
  • Middle-earth: Shadow of War

At the time of testing, the Final Fantasy XV benchmark with DLSS was available privately but not publicly. It's since been released publicly, but DLSS support has not yet (and maybe won't ever) been patched into the full game. I'll also be running some 4K-specific tests with RTX-enabled games and other recent AAA titles, like Battlefield 5, on newer drivers at a later date.

As for the games chosen, I opted for this mix because it runs on a variety of engines. There's a mix of DX11 and DX12 usage across the board – some, like Warhammer 2 and Shadow of the Tomb Raider, support both – and each game is built on an in-house engine. Almost all of the games are well optimized, with the exception of Total War: Warhammer 2. Creative Assembly's Warhammer RTS is more CPU-bound, but it's also the type of game that attracts gamers who spend more on their PCs than most, so I've kept it in the rotation for now.

Because of the time of year and my general workload, I wasn't able to broaden the testing to a Ubisoft title or an Unreal Engine-based game, and Battlefield 5 wasn't available at the time I ran these tests. I'll be looking to run more coverage featuring some of those games soon, though.

All games were tested across 1080p, 1440p and 4K using the highest presets available in each game. 3D Mark has no presets, but since its different tests run at different rendering resolutions, you get the same effect.

All tests were also run several times, with outlier results discarded. Some tests are more consistent than others – Shadow of War tends to return similar results whether you run it 17 times or 70 – but this is done to help avoid issues of variance. I also disabled automatic updates where possible on each of the games (which is easy to do for games running through Steam) to avoid inconsistencies from future performance improvements.
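To give a rough idea of what that averaging process looks like in practice, here's a minimal sketch (the run counts, trimming threshold and helper name are my own illustrative assumptions, not the exact tooling used for these tests):

```python
# Minimal sketch: average repeated benchmark runs, dropping obvious outliers.
# The 1.5-standard-deviation cutoff is an illustrative choice, not a standard.
from statistics import mean, stdev

def average_fps(runs, max_deviation=1.5):
    """Average per-run FPS results, discarding runs that sit more than
    max_deviation standard deviations from the mean of all runs."""
    if len(runs) < 3:
        return mean(runs)  # too few runs to sensibly trim anything
    m, s = mean(runs), stdev(runs)
    kept = [r for r in runs if abs(r - m) <= max_deviation * s]
    return mean(kept)

# Five runs of the same in-game benchmark, one of them an obvious outlier.
print(round(average_fps([61.2, 60.8, 61.5, 42.0, 61.0]), 1))  # -> 61.1
```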

This turned out to be beneficial for Shadow of the Tomb Raider: a later update caused stability issues for Nvidia owners, with players complaining about memory errors, which I discovered after I patched the game post-testing. Fortunately, Square Enix allows people to roll the game back to older versions through Steam's beta settings, something more developers should consider supporting.

One important factor, and one I'll explain after the results: these tests were run without HDR. I'll get into that at the end. Dynamic resolution was also manually disabled in games where the option exists, to ensure consistency.

Let's begin with the synthetic figures. Click or press on the benchmark graphs below if you need to expand them for readability. All figures are reported in average frames per second.

3D Mark Fire Strike

3D Mark is the standard synthetic test when it comes to gaming rigs. It's split into multiple tests that stress different parts of the system and GPU, before a combined test that hits both.

The RTX 2080 Ti is the king of the pack here, and it remains that way for the remainder of the tests. The advantage the GTX 1080 Ti has over the RTX 2080 – and the two will be neck and neck for a lot of these results – is the extra 3GB of VRAM, slightly more memory bandwidth (484GB/s versus the RTX 2080's 448GB/s) and a wider memory bus.
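Those bandwidth figures aren't magic, by the way – they fall straight out of the cards' published memory specs (effective per-pin speed multiplied by bus width). A quick sketch, purely for illustration:

```python
# Peak memory bandwidth = effective speed per pin (Gbps) x bus width (bits) / 8 bits per byte.
def memory_bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(memory_bandwidth_gbs(11, 352))  # GTX 1080 Ti: 11Gbps GDDR5X on a 352-bit bus -> 484.0 GB/s
print(memory_bandwidth_gbs(14, 256))  # RTX 2080: 14Gbps GDDR6 on a 256-bit bus -> 448.0 GB/s
```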

The RTX 2080 is higher clocked, though, and in the majority of instances it comes out ahead of the GTX 1080 Ti while having the bonus hardware for futureproofing. The Fire Strike tests, however, are one area where it fell just behind. But the difference is fractional, and within the margin of error. It's also common to see incremental jumps with future drivers, so keep that in mind as we move forward.

Shadow of the Tomb Raider

Lara's latest adventure was slated to be one of the first games with ray-traced shadows, and it looked a treat over at Gamescom. Shadow hadn't been updated with ray-traced shadows at the time of testing, but the in-game benchmark is improved over the one that shipped with Rise of the Tomb Raider, offering a more representative recreation of in-game performance as Lara traverses a town center and jungles.

When Nvidia proclaims that the "4K 60fps" dream has been realized, this is generally the kind of result they're talking about. The frame rate dipped below the 60fps waterline on the ultra preset, but as a reminder: the 7900X is not the best gaming CPU around. Had the tests been run with an i7-8700K, one of the newer i9 CPUs, or the stellar Ryzen 7 2700X all-rounder, the 2080 would have had more headroom at Ultra settings.

As for the RTX 2080 and GTX 1080 Ti, I'd actually consider just staying at 1440p. Having more overhead is important for the most intensive scenes, something you have to factor in when comparing benchmark figures against the general nature of gameplay. A solid 60fps when the sun is setting over the horizon is nice. A solid 60fps in the heat of battle is much, much better.

Middle-earth: Shadow of War

Monolith's orc-slaying/dominating simulator can be quite the looker when all the textures are bumped up to their highest. It's also a fun game in its own right, especially now that the microtransaction elements have been patched out and some solid expansions have been released.

Shadow of War has an in-built benchmark that runs through a single unbroken scene, flying through some vegetation before diving into a bloody castle battle and approaching an orc boss in full chain armor. The game also supports HDR, provided you have enabled the requisite settings in your Windows display settings.

It's a well-optimized game, and one you should have no problem enjoying at 4K. The RTX 2080 Ti has far and away the most headroom, although if you drop down to High settings, a smooth 144fps (the target for high refresh rate gaming monitors) is within reach for all of the flagships here.

Chain armor does look real nice at 4K, though. I'm looking forward to replaying this later when I have a bit of time off.

Forza Horizon 4

It hasn't gotten quite as much attention as it should have, but holy shit is Forza Horizon 4 well optimized. It's almost at DOOM levels of performance for how well all three cards run it, and I would expect similarly great results for users with the RTX 2070, GTX 1070 and AMD cards too (given that the developers would have a lot of experience optimizing for AMD hardware in the Xbox One X).

Better: Forza Horizon 4 has one of the best in-game benchmarks, replicating a short race against AI which isn't too dissimilar from actual gameplay. And it's a great showcase for just how well all three of Nvidia's cards perform: all three are capable of maintaining well above 60fps at 4K, at any preset.

The gap between the GTX 1080 Ti and RTX 2080 narrows as Forza Horizon 4 eats up more VRAM, which is expected when the resolution starts picking up. It's also a good reminder of the frame rate hit separating the highest possible presets from the second or third-best option.

A game running at 4K on High is generally going to look better than 1440p on the Ultra preset – you're getting sharper textures, anti-aliasing algorithms don't have to work as hard, and clarity will be better since you're playing at the screen's native resolution (assuming you're playing on a 4K screen).
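The raw pixel counts show why that resolution bump carries so much weight – 4K pushes roughly 2.25 times the pixels of 1440p. A quick back-of-the-envelope comparison:

```python
# Pixel counts for the three tested resolutions.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")  # 2,073,600 / 3,686,400 / 8,294,400
print(f"4K renders {pixels['4K'] / pixels['1440p']:.2f}x the pixels of 1440p")  # 2.25x
```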

And even then, I'd still recommend downsampling when the results are this good.

Total War: Warhammer 2

An old favorite, Total War: Warhammer 2 hammers CPUs by throwing tons of units onto the battlefield while effects and spells decimate the land. For these tests, I've used the heavier Skaven battle benchmark, rather than the original battle or campaign benchmarks.

Warhammer 2 runs on DX11 with "beta" support for DX12, although Nvidia cards perform better in the DX11 mode, so I've left it at that.

You can see the obvious limitation throughout the results: it's the CPU, not the GPU, which helps explain why the Ultra preset results in basically no difference in performance between 1080p and 1440p for all three cards. Things change a little once Warhammer 2 starts eating up more VRAM at 4K, but in general the poorer performance here comes down to a level of optimization that's just not as refined as the other titles.
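A simple way to picture a CPU-bound game like this (a toy model, not how the engine actually schedules work): each frame can only finish as fast as the slower of the CPU and GPU, so dropping the resolution only shrinks the GPU's share of the work. The millisecond figures below are invented purely for illustration.

```python
# Toy model of a CPU-bound title: frame rate is capped by whichever of the
# CPU or GPU takes longer per frame. All timings here are made up.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 18.0  # per-frame CPU cost barely changes with resolution
for res, gpu_ms in [("1080p", 9.0), ("1440p", 13.0), ("4K", 24.0)]:
    print(res, round(fps(cpu_ms, gpu_ms), 1))  # 55.6, 55.6, then 41.7 at 4K
```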

That's to be expected: this is the oldest game in this lineup, and I'll be keen to see what improvements Creative Assembly makes with the next Total War games, particularly to their DirectX 12 implementation. There are a lot of multi-threading benefits within DX12 that would be a natural fit for Total War games, so we'll have to sit tight until Total War: Warhammer 3 rolls around.


A Word on HDR

Gaming in HDR has been possible for a few GPU generations now. Support was enabled for the GeForce 900 series GPUs through HDMI, while AMD cards from the R9 380 and RX 460 onwards have supported HDR through both DisplayPort and HDMI. It's slightly trickier if you have a G-Sync monitor: only the GTX 1050 series or higher is supported.

Support for HDR among PC games is becoming more standard, especially among AAA titles, since many of them have already built out their preferred HDR implementations for consoles. Games like Destiny 2, Battlefield 1, the latest Assassin's Creed games, ARK: Survival Evolved and HITMAN are just some of the titles with HDR support. In the tests above, Shadow of War, Forza Horizon 4 and Shadow of the Tomb Raider all support HDR, while Total War: Warhammer 2 does not.

So, you might ask: why not test everything in HDR?

The reasons are twofold. Firstly, the vast majority of PC gamers still don't own a primary or secondary monitor that supports HDR. The preference is still very much for monitors with a high refresh rate, or better color reproduction, over a monitor that can do HDR. Monitors that support all of these things – like the Acer X27 Predator which Nvidia supplied for testing – are extraordinarily expensive. The Acer X27, which supports G-Sync, 144Hz, HDR and 4K, will set you back $2800 at the time of writing, or $3500 if you want the ASUS ROG Swift 27" equivalent.

If you just want a 4K screen without the 144Hz support, you're looking at around $770. But a good monitor will last you the best part of a decade, so I'm not going to argue against owning one.

HDR panels have taken a while to filter through to PC gaming, primarily because manufacturers have concentrated on other ends of the market: smaller screens for phones, and larger displays for TVs. PC monitors are a smaller market with slimmer profit margins than either of those two extremes, and as a result many PC gamers are still making do without.

The other roadblock in the way of HDR is Windows. Support for HDR in Windows has not been fantastic over the last 12 months, and while this year's April update improved how Windows handles SDR content, it's still pretty awful. SDR content still looks washed out when HDR is enabled, and you have different HDR implementations to deal with: some games support Dolby Vision, others just support HDR10, and others have sliders to allow you to adjust the luminance so your eyes don't bleed out.

But I did run a short batch of tests just to illustrate one thing: the lack of a meaningful performance difference between HDR and non-HDR. The GTX 10 series has supported HDR, but it's always come at a slight performance hit. There's still a small hit in this testing, but for the most part, if you want to run a game in HDR and can get it to a comfortable, playable point, performance shouldn't be a problem.

Before we get into the final nitty-gritty and dissect the prices of all these cards, we need to talk about AI.

Deep Learning Super Sampling (DLSS)

The range of AI-powered technology in the RTX cards, especially the updates being made to Ansel, is rather cool. But of all of it, it's DLSS, Nvidia's neural network-powered anti-aliasing technique, that will have the most performance impact for now.

At the time of writing, two synthetic tests were available, but they only worked with Nvidia's RTX cards. One of them is a 3D Mark-style test built on Epic's Infiltrator demo. You can see a video of that running from Guru3D on YouTube below, to give you an indication of what we're talking about:

The second was a separate build of the Final Fantasy XV benchmark that supported DLSS. You can get the benchmark for yourself, with or without DLSS, through the FFXV site here.

At the time of writing, this is the closest we have to approximating the performance benefits of DLSS. That said, there are some strong arguments why it shouldn't be considered in testing.

When it was first released earlier in the year, the FFXV benchmark came under fire. The general gist of the problem was that the benchmark wasn't rendering objects and models properly, and Square Enix admitted on February 6 that the benchmark was beset with stuttering and level of detail problems that "will be addressed in the shipping game".

For the most part, those issues were addressed in the final PC release. The benchmark itself, however, wasn't updated – which is what kind of makes all of this moot.

So while the FFXV benchmark does showcase improvements when DLSS is enabled, it's a really, really flawed benchmark. It still reports an arbitrary score, rather than standardized metrics that line up with any other reporting, and the unresolved issues make it too unreliable to be comfortable using it as a gauge of real-world performance.

Having seen DLSS in action at Gamescom earlier this year, I'm still very hopeful that it'll be a performance boon for RTX owners when it starts to roll out in games. I just don't think the FFXV benchmark meets the standard, and with further development on the PC version of FFXV having been canceled, it seems unlikely that DLSS will ever be implemented in the full game. I think it's still worth seeing how FFXV handles at 4K, particularly given that Nvidia helped to develop the PC version before release, but that's for a future article.


Whichever of the three flagship GPUs you go for, you're going to be spending at least $1150. Locally, the RTX 2080 is priced about the same as the GTX 1080 Ti, which neuters some of the value seen overseas, where pricing on the GTX 1080 Ti has become rather competitive.

More importantly, the stock of the RTX 2080 is more broadly available. I've even seen instances – albeit limited – of the RTX 2080 being priced under $1150, although you'll have to buy through Newegg for that.

But for someone buying today – someone genuinely considering the investment in a card that will last them at least three years – here's how I would think about it.

There's a much stronger value proposition in spending over a grand on a card that can run today's games at the highest settings – with overhead to spare – than on a card that will mostly get you there. When you factor in the natural depreciation of technology, and ray tracing in particular becoming more popular, someone with an 8th-gen Intel gaming rig or a 2nd-gen Ryzen setup is going to get more mileage out of the RTX 2080 Ti, which will have no problems at 1440p and above even a couple of years from now.

That's the best way to think about these cards. How much are you looking to invest over the course of the next few years? It's one thing to spend $600 or $700 on a video card now. But you have to think about how long you want it to last, when you'll be able to upgrade again, and where you'll get the best mileage from the rest of your system.

If money is no object, and you already have a reasonable system limited by a GPU that's a generation or two old – people still on the 900 series GPUs, or perhaps making the most out of an AMD RX 480 or R9 390X – the RTX 2080 Ti offers a substantial upgrade in performance that will hang around for years.

On a pure value proposition of what you can buy today, the RTX 2080 offers better value for Australians. That's not the case overseas, where stock of the GTX 1080 Ti is more readily available and more competitively priced, but you can only play the cards you're dealt. Besides, it's a better situation for gamers: the more modern technology is on par with, if not slightly better than, the 1080 Ti, bar slight reductions in memory bandwidth and VRAM, and it brings the benefits of an upgraded NVENC encoder (which streamers will enjoy), dedicated RT and tensor cores for ray tracing and AI, and better energy efficiency.

But there's plenty this doesn't account for. It doesn't factor in, for instance, whether a $500 or $600 investment now (with a view to buying the second generation of RT cards in two or three years) is better value. Or what impact AMD's 7nm cards will have next year.

And it's AMD's presence that could ultimately end up strengthening the argument for the RTX cards, especially if AMD follows suit by supporting real-time ray tracing in convincing fashion. Even if the performance doesn't match up to Nvidia's – and going by previous experience, it won't at first – support from both manufacturers will help increase developer adoption down the road.

And then there's the future downward pressure on prices to consider.

So I'll leave it at this. If you're in the fortunate financial position to consider purchasing any one of these cards, and the raw value proposition is of less concern, then you might as well go all out. The RTX 2080 Ti is a fantastic card, with enough overhead at full settings to please. If you're loaded, you won't be disappointed – at least not in raw performance. Ray tracing is another issue, although the ongoing nightmare of Windows updates doesn't help there.

If you're after an almost top-of-the-line upgrade, but aren't sure whether the GTX 1080 Ti is a better buy, the RTX 2080 is the better choice. It's evenly priced, more supply is available locally, and you'll have some futureproofing for the next couple of years, once developers become accustomed to ray tracing and AI-powered tech in general. The exception would be for people who do a lot of Adobe work or professional rendering – the extra VRAM and CUDA cores in the GTX 1080 Ti might be more useful there, without sacrificing much in gaming performance. But that's a moot point if supply remains limited.

And if you're waiting for prices to drop first: carry on as you were. These are the best cards on the market, but hardly the most affordable.

As a gamer who grew up poor and played on a lot of aging systems (courtesy of local banks who didn't want them, or didn't know what to do with them), I'll always lean towards the best bang for buck. Once Nvidia has more competition in the market and prices start to fall below four digits, that equation changes. An RTX 2080 around the $800 or $900 mark isn't a price to baulk at.

That said, there are always going to be gamers who have the money to splurge today. And for that person who buys the RTX 2080 Ti?

Just make sure you have a nice screen to go with it.

