Developer Blog 8: Windows 10 Performance Guide
Gears of War 4 is out there. It's a crazy thought for all of us at the studio, as fans around the world get hands-on with Gears of War 4: Ultimate Edition Early Access. We've read the reviews, watched your streams, laughed at your reaction videos and eagerly followed your comments. Thank you for sharing your experience with our game - it's why we do this.
Some of you are playing on Xbox. Others on PC. Maybe even both. But it's time we did a Developer Blog dedicated to the game on Windows 10. Welcome to a special Developer Blog, featuring an extensive performance guide for Gears of War 4 on Windows 10. Take it away Cam.
Gears of War 4 Performance Guide
Hey everyone! My name’s Cam McRae, and I’m the Technical Director for Windows 10. Our team has spent a lot of time listening to your feedback on Gears of War: Ultimate Edition, and we’ve worked hard to make the PC edition of Gears of War 4 a premium experience for PC gamers.
With the release of Gears of War 4 on PC just around the corner – October 7 for Ultimate Edition buyers and October 11 for the standard edition – many people may be wondering “Will my machine be able to run Gears 4?”. We’ve tested a variety of setups in Gears of War 4, from the low end to the high end and everything in between, so we wanted to put together a short guide to showcase the performance you can expect on day one across a variety of video cards.
First, here are the minimum, recommended and ideal specs:
It’s helpful to understand resolution and performance targets to go along with each spec.
- Minimum: 1920x1080 @ 45 FPS
- Recommended: 2560x1440 @ 60 FPS
- Ideal: 3840x2160 @ 30 FPS
The minimum spec for Gears of War 4 is our target for a well-performing game with good visual quality. With that in mind, the minimum spec does not use the lowest settings, leaving room for less powerful systems to still play the game.
Gears of War 4 comes with a built-in benchmark to help you tune your visual settings and see the expected performance in campaign gameplay. The benchmark is made of two scenes: the first is an average scene focusing on combat, the second is a heavier scene highlighting destruction, physics, combat and lots of visual effects.
The benchmark runs for one minute and then presents you with a variety of information to understand how the game performs with the currently selected settings. Here’s a breakdown of that information:
- Hardware Configuration
This section details the GPU, CPU, VRAM, and total system RAM on the computer. Note that the VRAM section displays the total VRAM on the card as well as the amount of VRAM available to the game. The OS always keeps a reservation for other apps running on your computer that is not available to the game.
- Visual Settings
These are the visual settings used on this run of the benchmark. There are a few settings not represented due to space constraints, but those settings typically have a minor impact on performance.
There are five important stats from the benchmark displayed here:
Average GPU Framerate: The average framerate achieved by your video card.
Average CPU Framerate (game): The average framerate the game thread can achieve. The game thread encompasses all work needed to simulate one frame. The work is spread across multiple cores using a thread pooling system.
Average CPU Framerate (render): The average framerate the render thread can achieve. The render thread encompasses all work needed to translate the simulation data to visual data that the GPU will process. The work is spread across multiple cores using a thread pooling system.
Average Minimum Framerate (bottom 5%): The average framerate of the slowest 5% of frames. You can use this as an indicator of your lowest possible performance.
GPU Bound: This bar ranges from 0% to 100% and indicates if the game is CPU bound (0% - bad) or GPU bound (100% - good). We generally want to see the game GPU bound because there are many more options to scale performance on the GPU. The CPU is busy doing important work that can’t be scaled as easily. As you achieve higher and higher framerates, the game will inevitably become CPU bound. When this happens the threads will not have a chance to give up time to other tasks, which can cause some longer running tasks (texture loading and decompression, for example) to be slowed down. We do have settings available that will reduce some load on the CPU if you find yourself CPU bound.
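To make these stats concrete, here's a rough sketch of how numbers like these can be derived from per-frame timings. The frame data and the GPU-bound heuristic below are illustrative assumptions for this guide - the benchmark's actual telemetry is internal to the game:

```python
# Illustrative sketch: deriving benchmark-style stats from per-frame
# timings (in milliseconds). The GPU-bound test here - "the GPU took
# longer than the CPU this frame" - is an assumption for illustration.

def average_fps(frame_times_ms):
    """Average framerate implied by a list of per-frame times."""
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def bottom_5_percent_fps(frame_times_ms):
    """Average framerate of the slowest 5% of frames."""
    worst = sorted(frame_times_ms, reverse=True)
    count = max(1, len(worst) // 20)  # slowest 5%, at least one frame
    return 1000.0 / (sum(worst[:count]) / count)

def gpu_bound_percent(gpu_times_ms, cpu_times_ms):
    """Percentage of frames where the GPU, not the CPU, was the limiter."""
    bound = sum(g > c for g, c in zip(gpu_times_ms, cpu_times_ms))
    return 100.0 * bound / len(gpu_times_ms)
```

For example, a run of 95 frames at 10 ms plus 5 frames at 20 ms yields a bottom-5% average of 50 FPS even though the overall average is above 95 FPS - which is why the bottom-5% stat is a better indicator of how drops will feel.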
- Benchmark Graph
This point graph has a point drawn for every frame processed by the GPU and two main CPU threads (game and render). You can use it to visualize performance over the entire benchmark and to understand problems you may be having with frame drops (more on that later).
Gears of War 4 has over 30 settings at your command to tweak and tune your performance and visual quality as you see fit. Or, you can just stick with our Recommended settings that are auto detected based on your hardware to get the best mix of performance and visual quality.
Advanced Video Settings
We have added an additional quality level we call “Insane” on two settings: Screen Space Reflections and Depth of Field. We consider these to be “Next Gen” quality levels which, so far, have generally only been showcased in tech demos. The game will not auto-select these quality levels; you need to turn them on specifically, and they can have quite a high performance cost. However, today’s high-end GPUs can utilize them at 1080p while still maintaining a good framerate.
NVIDIA has put together an outstanding guide that covers the settings in Gears of War 4 in detail, which you can read here: https://www.geforce.com/whats-new/guides/gears-of-war-4-graphics-and-performance-guide
For the purposes of this guide, we are using a consistent set of base hardware. We’ve decided to use our recommended spec CPU to highlight the performance we expect most gamers will be able to experience. This does mean higher-end GPUs will become CPU bound in these tests, limiting their framerate. We expect other benchmarks to really highlight what the game is capable of with the best hardware available. Gears of War 4 is highly asynchronous and will scale across many cores.
- CPU: Intel i5 4690K @ 3.5 GHz
- Motherboard: MSI Z97 G45 Gaming
- Memory: 8GB DDR3-1600
- Hard Drive: Intel SSD 520 Series 240GB
- Windows: Windows 10 Enterprise 14393.222 64 bit
- Game: Retail 22.214.171.124 (Downloaded and installed through the Windows Store)
- NVIDIA Driver: 373.02 (9/30/2016)
- AMD Driver: 16.9.2 (9/20/2016)
NVIDIA and AMD both have Game Ready drivers available at the time of posting. Make sure you go get them here for the best Gears of War 4 experience:
NVIDIA (driver version 373.06)
AMD (driver version 16.10.1)
NVIDIA Video Cards Benchmarked
- GeForce 670
- GeForce 680
- GeForce 750 Ti
- GeForce 770
- GeForce 780
- GeForce 960
- GeForce 970
- GeForce 980
- GeForce 980 Ti
- GeForce 1060 (3GB)
- GeForce 1060 (6GB)
- GeForce 1070
- GeForce 1080
AMD Video Cards Benchmarked
- Radeon HD 7850
- Radeon HD 7970
- Radeon R7 260X
- Radeon R9 270
- Radeon R9 280
- Radeon R9 290
- Radeon R9 380
- Radeon R9 390
- Radeon R9 390X
- Radeon RX 460
- Radeon RX 470
- Radeon RX 480 (8GB)
- Radeon R9 Fury X
Gears of War 4 runs an analysis of your GPU on first launch to determine the Recommended settings for the best mix of performance and visual quality. Recommended settings are not resolution-aware, so they do not change as you adjust the resolution or resolution scale.
We’ll start by looking at performance for each spec at each card’s recommended settings and then show comparisons at Recommended settings and Ultra settings for a consistent baseline. The only change we’ll make to the Recommended settings is to disable Vertical Sync, which allows the GPU to render without waiting for the monitor synchronization.
Let’s start with the minimum spec for NVIDIA and AMD.
Figure 1 – GeForce 750 Ti with Recommended Settings @ 1080p.
The 750 Ti averages 46.5 FPS. It hovers near 60 FPS on average scenes and drops down to ~38 FPS on a heavier scene.
Figure 2 – Radeon R7 260X with Recommended Settings @ 1080p.
The R7 260X averages 48.1 FPS. It has a very similar performance profile to the 750 Ti, dropping down to ~38 FPS on a heavy scene.
As mentioned previously, the recommended spec targets 2560x1440 resolution. Let’s look at the popular GeForce 970 first.
Figure 3 – GeForce 970 with Recommended Settings @ 1440p.
The 970 manages to hang on to an average framerate of 60.8 FPS. In average scenes it is generally above 60 FPS, but it will dip down to ~55 FPS on a heavy scene.
How does it fare at the more popular 1920x1080 resolution?
Figure 4 – GeForce 970 with Recommended Settings @ 1080p.
An average of 89.9 FPS with the bottom 5% of frames averaging out to 64.6 FPS. Here we can start to see the GPU outperforming the CPU in some cases with about 6% of frames being bound to the CPU.
Let’s compare to our other recommended NVIDIA card, the GeForce 1060 (6GB).
Figure 5 – GeForce 1060 with Recommended Settings @ 1440p.
The 1060 edges out the 970 with an average of 63.7 FPS, but it does have a slightly lower bottom 5%. This is due to the 1060’s 6GB of VRAM allowing it to load Ultra detail textures. Compared to the 970, the 1060 has loaded almost 2GB more texture data into VRAM. On the 4-core i5 this can result in a performance hit from the extra loading and decompression work.
This is also the first NVIDIA card where we can see Async Compute enabled. Async Compute work in Gears of War 4 generally has a modest 2-5% gain, depending on resolution. The setting is not applicable to NVIDIA cards prior to the Pascal line.
Now let’s take a look at our recommendations for AMD:
Figure 6 – Radeon R9 290 with Recommended Settings @ 1440p.
While we recommend the R9 290X, we’ve gone with the R9 290 here. With an average framerate of 62.1 FPS, the R9 290 has a slight edge on the 970 and is just a little shy of the 1060, though it takes a slight hit to visual fidelity with some settings moving from Ultra to High. Factor in the 5-10% performance gain from moving up to the 290X and you’re looking at great performance and visual quality from our recommended AMD card.
We also recommend the new Radeon RX 480.
Figure 7 – Radeon RX 480 (8GB) with Recommended Settings @ 1440p.
With 8GB of VRAM, the 480 is a step ahead of the 290 in texture fidelity and receives a boost in other settings as well, putting it visually ahead of the GeForce 970 with nearly identical performance. It is just shy of the comparable GeForce 1060 (6GB), with an average of 60.6 FPS.
In summary, you’re looking at a 60 FPS experience at 1440p and closer to 85-90 FPS at 1080p with our recommended set of cards.
The ideal spec targets two things: 4K resolution and Ultra settings. To achieve 4K resolution, a GPU will require at least 4GB of VRAM.
Starting with the GeForce 980 Ti:
Figure 8 – GeForce 980 Ti with Recommended Settings @ 2160p (4K).
The 980 Ti is a workhorse. With 6GB of VRAM pushing it into the range of Ultra textures, it will give you a dependable 36.5 FPS average at 4K, rarely dropping below 30. Let’s compare AMD’s offering, the Fury X.
Figure 9 – Radeon R9 Fury X with Recommended Settings @ 2160p (4K).
The Fury X edges out the 980 Ti at 4K with an average of 39.3 FPS. Like the 980 Ti, it rarely drops below 30 FPS. However, with only 4GB of VRAM it is not able to load the Ultra quality textures.
Finally, let’s take a look at the powerful GeForce 1080.
Figure 10 – GeForce 1080 with Recommended Settings @ 2160p (4K).
The GeForce 1080 handles 4K well with an average of 45 FPS, and the bottom 5% of its frames average close to the overall averages of the 980 Ti and the Fury X. However, with all Ultra settings, even this GPU won’t hit 60 FPS at 4K (in Campaign). That is only possible with the NVIDIA Titan X right now. For 60 FPS at 4K on the 1080, try using High settings:
Figure 11 – GeForce 1080 with High Settings @ 2160p (4K).
Dropping to the High default quality gets an average of 61.5 FPS. In average scenes the game is between 60 and 70 FPS, dropping slightly below 60 FPS in a heavy scene. This is a great time to use Dynamic Resolution with a frame rate limit of 60 to ensure a smooth 60 FPS.
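The idea behind Dynamic Resolution can be sketched as a simple feedback loop: render fewer pixels when a frame runs over the 60 FPS budget, and grow back toward full resolution when there is headroom. The step size and bounds below are assumptions for illustration - the game's actual controller is internal:

```python
# Illustrative sketch of a dynamic-resolution feedback loop. The 5%
# step and the 0.5-1.0 scale bounds are assumed values, not the
# game's real tuning.

TARGET_MS = 1000.0 / 60.0  # 60 FPS frame budget (~16.7 ms)

def next_resolution_scale(scale, last_frame_ms, step=0.05,
                          lo=0.5, hi=1.0):
    if last_frame_ms > TARGET_MS:           # over budget: shrink
        scale -= step
    elif last_frame_ms < TARGET_MS * 0.9:   # clear headroom: grow back
        scale += step
    return min(hi, max(lo, scale))
```

Run each frame, this nudges the render scale down during heavy scenes and back up in lighter ones, trading a little sharpness for a steady 60 FPS.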
Finally, how will the 1080 look at 1080p?
Figure 12 – GeForce 1080 with Recommended Settings @ 1080p.
An average GPU framerate of 113.2 FPS is great, but that’s not the reality of what you’ll see in game. Here we see the effect of becoming CPU bound. 60% of frames are bound by the CPU, lowering the effective average framerate to somewhere closer to the render thread’s average of 104.6 FPS. What this illustrates is that the GPU can render so fast the CPU can’t keep up with its demands. A faster processor with more cores would help feed the GPU faster and open up the game to higher framerates.
In summary, Gears of War 4 is a demanding game with Ultra settings at 4K, in part because of a number of screen space settings in effect (Screen Space AO, Screen Space Shadows, Screen Space Reflections). Instead of locking away high quality but demanding visuals, we’ve given you the power to adjust the game to your preference of visual fidelity vs. performance.
The benchmark represents expected performance in the Campaign. Our target for Multiplayer on Windows 10 is always at least 60 FPS using Recommended settings. In general, you can expect 1.5x to 2x the framerate in Multiplayer over single player. This means, for example, the GeForce 1080 will easily achieve 60 FPS in Multiplayer at 4K.
You can use the “Show Stats” setting set to “All” in the video options menu to detail your performance at all times in the lower right corner as seen in the below screenshot of the Multiplayer map Impact.
Figure 13 – GeForce 1080 at 69 FPS with Ultra Settings @ 4K in Horde Mode on Impact.
Now onto direct comparisons of the average framerate at each resolution using Ultra settings as a baseline with Recommended settings included for reference.
The 1080 is clearly the top performing card, but the GeForce 980’s recommended settings (which are the same as the 970’s) give it a good performance boost; it looks like the 980 has room to scale some of the settings up further. Only the 750 Ti drops below a 30 FPS average at 1080p Ultra.
At 1440p we can see that the 970 is the line before we drop below 60 FPS at recommended settings. At Ultra, you’re going to need a 980 Ti, 1070 or 1080 to get 60 FPS.
Due to our requirement of 4GB minimum for 4K, a few cards drop off. In order to maintain a playable framerate, we think you should stick with the recommended settings at 4K (or reduce visual quality).
The Fury X is a great choice for top performance, but the new 480 can also maintain top visual quality and performance with Ultra settings at 1080p.
You’re going to want an R9 290 or above with Recommended settings to hit 60 FPS at 1440p. The Fury X continues to deliver with 64.4 FPS at Ultra.
Only the Fury X will deliver above 30 FPS with Ultra settings at 4K. Stick with the Recommended settings to maintain a playable 30 FPS for everything else up to the 480.
Of course, we need an overall comparison of all cards to finish things off.
Tips and Troubleshooting
You can use the benchmark to investigate any performance problems you may be having due to external factors, or due to a specific setting. If investigating settings, read the descriptions to see which ones have the largest cost and start by reducing those as much as you feel comfortable.
Xbox App Game DVR
The background recording feature of Game DVR in the Xbox App can have a high performance cost. Here’s a benchmark point graph to illustrate:
Figure 14 – Impact of Game DVR Background Recording.
The consistent low drops were caused by the background recording quality being set to 60 FPS. If you have similar performance problems, make sure background recording is set to a normal quality level or disable it completely:
Start -> Xbox App -> Settings -> Game DVR -> Background Recording = Off.
As mentioned previously in the guide, Ultra texture detail can have a performance cost on systems with a 4-core CPU and a fast GPU (with at least 6GB of VRAM). Ultra texture detail raises the minimum and maximum mipmap loaded for textures. The maximum mipmap is streamed in over time, but the minimum is loaded up front with the texture into VRAM. If the GPU is rendering sufficiently fast, it may become blocked by the CPU loading and decompressing the textures it needs. This is especially relevant with the World Texture Detail set to Ultra because of the amount of textures belonging to the World group.
Figure 15 – Stalls from a high performance GPU waiting for a slower CPU to load Ultra detail textures.
We recommend using Ultra textures on our Ideal spec CPU where there are more cores available to work on the texture loading.
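To see why raising the minimum mip is so costly up front, note that each mip level below the top is a quarter the size of the one above it. This hypothetical sketch (assuming uncompressed 4-bytes-per-texel textures for simplicity; real textures are block compressed) estimates the data that must be loaded before a texture is usable:

```python
# Illustrative sketch: up-front memory cost of raising the minimum
# resident mip level. Mip 0 is the full-resolution image; mip N halves
# each dimension N times, so each step down costs 4x more than the
# level below it. The 4 bytes/texel figure is a simplifying assumption.

def mip_bytes(width, height, bytes_per_texel=4):
    """Size of a single mip level."""
    return width * height * bytes_per_texel

def upfront_bytes(top_width, top_height, min_mip):
    """Bytes loaded up front when mips from min_mip down are resident."""
    total = 0
    w, h, mip = top_width, top_height, 0
    while w >= 1 and h >= 1:
        if mip >= min_mip:
            total += mip_bytes(w, h)
        w //= 2
        h //= 2
        mip += 1
    return total
```

For a 4096x4096 texture, including mip 0 in the up-front load adds 64 MB per texture on top of everything below it, which is exactly the kind of loading and decompression work that can stall a fast GPU behind a 4-core CPU.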
In addition, setting texture details too high will result in larger textures being loaded into memory. Due to the extra memory used, this can quickly exhaust VRAM on GPUs with lower amounts (3GB or less), which will ultimately result in some objects not getting higher quality textures loaded. If you see this behavior, simply lower some of the texture detail settings. The video settings screen will warn you if you’ve selected an option that could result in this behavior.
Figure 16 – Warnings in the video settings from texture details being set too high.
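The check behind a warning like this can be sketched as a simple budget comparison. The per-level cost table and setting names below are made-up illustrative values, not the game's real data:

```python
# Hypothetical sketch of a texture-detail VRAM budget check: if the
# estimated cost of the selected detail levels exceeds the VRAM
# available to the game, warn that some objects may not get their
# high-quality textures. The cost table is an illustrative assumption.

ESTIMATED_COST_GB = {"low": 0.5, "medium": 1.0, "high": 1.5, "ultra": 2.5}

def texture_settings_warning(world_detail, character_detail,
                             available_vram_gb):
    needed = (ESTIMATED_COST_GB[world_detail]
              + ESTIMATED_COST_GB[character_detail])
    if needed > available_vram_gb:
        return ("Selected texture detail may exceed available VRAM "
                f"({needed:.1f} GB needed, {available_vram_gb:.1f} GB free)")
    return None  # settings fit within the budget
```

Under these assumed numbers, a card with 3GB available would trigger the warning with both groups on Ultra, but not on High - matching the advice above to lower texture detail settings on lower-VRAM GPUs.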
The benchmark can help you discover that the CPU is under load from some other task.
Figure 17 – Point graph when CPU has contention from other apps.
In the above graph, the points are scattered widely instead of loosely grouped. This is the result of running moderately CPU-intensive tasks in the background, outside of the game.
Remember to turn Vsync off before benchmarking, otherwise you will not exceed the monitor refresh rate.
Figure 18 – Benchmark with Vsync on, framerate won’t go above the monitor refresh rate.
Hopefully this guide has helped you get ready for great performance from Gears of War 4 on launch day. With over 30 settings at your command there is plenty of room to make adjustments to suit your needs, or you can use our Recommended Settings and be guaranteed a good experience. We hope you enjoy playing the game in all its glory on Windows 10 within the next week!