PC graphics options explained


Nvidia and AMD both offer tools that pick optimal graphics settings for the games you own, and both do a fine job of balancing quality and performance. But I just like doing things myself. It's the PC gamer way, right? We tinker on our own terms.

If you're new to graphics tuning, this guide will explain the major settings you need to know about and, without getting too technical, what they do. Understanding how it all works can help with troubleshooting, setting up the most gorgeous screenshots possible, or playing with tools like Durante's GeDoSaTo.

We start with the fundamental concepts on this page. For the sections on anti-aliasing, anisotropic filtering, and post-processing that follow, I consulted with Nicholas Vining, Gaslamp Games' technical director and lead programmer, as well as Cryptic Sea designer/programmer Alex Austin. I also received input from Nvidia regarding my explanation of texture filtering. Keep in mind that graphics rendering is much more complex than presented here. I'm a technology enthusiast translating these systems into simple analogies, not an engineer writing a technical paper, so I'm leaving out major details of actual implementation.

Resolution and FPS

A pixel is the most basic unit of a digital image—a tiny dot of color—and resolution is the number of pixel columns and pixel rows in an image or on your display. The most common display resolutions today are 1280x720 (720p), 1920x1080 (1080p), 2560x1440 (1440p), and 3840x2160 (4K or 'ultra-HD'). Those are 16:9 resolutions; if you have a display with a 16:10 aspect ratio, they'll be slightly different: 1920x1200, 2560x1600, and so on. Newer ultrawide displays can be 2560x1080, 3440x1440, etc.
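If you like seeing the numbers, here's a quick back-of-the-envelope sketch (Python, purely for illustration) that adds up the pixels in each of those 16:9 resolutions. Notice that 4K works out to four times the pixels of 1080p.

```python
# Total pixel counts for common 16:9 resolutions, relative to 1080p.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the pixels of 1080p)")
```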

Frames per second (FPS)

If you think of a game as a series of animation cels—still images representing single moments in time—the FPS is the number of images generated each second. It's not the same as the refresh rate, which is the number of times your display updates per second, and is measured in hertz (Hz). 1 Hz is one cycle per second, so the two measurements are easy to compare: a 60 Hz monitor updates 60 times per second, and a game running at 60 FPS should feed it new frames at the same rate.
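A handy way to see that relationship is in per-frame time. Here's a tiny sketch (Python, illustrative only; the 60 Hz display and 60 FPS target are just assumptions for the example):

```python
# Convert framerates and refresh rates into per-frame time budgets.
def frame_time_ms(rate: float) -> float:
    """Milliseconds per frame (or per refresh) at a given rate."""
    return 1000.0 / rate

refresh_hz = 60   # a common 60 Hz monitor (assumed for this example)
game_fps = 60     # a game rendering 60 frames per second

print(f"Refresh interval at {refresh_hz} Hz: {frame_time_ms(refresh_hz):.1f} ms")
print(f"Frame time at {game_fps} FPS: {frame_time_ms(game_fps):.1f} ms")
# Both come out to about 16.7 ms, which is why 60 FPS pairs neatly with 60 Hz.
```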

The more work you make your graphics card do to render bigger, prettier frames, the lower your FPS will be. If the framerate is too low, frames will be repeated and it will become uncomfortable to view—an ugly, stuttering world. Competitive players seek out high framerates in an effort to reduce input lag, but at the expense of screen tearing (more on that below), while high-resolution early adopters may be satisfied with playable framerates at 1440p or 4K. The most common goal today is 1080p/60 fps, though 1440p, 4K, and framerates above 120 are also desirable. A high refresh rate monitor (120-144 Hz) with the framerate to match is ideal. 

Because most games don't have a built-in benchmarking tool, the most important item in your tweaking toolbox is software that displays the current framerate. ShadowPlay or FRAPS work fine in many games, or you can use utilities like RivaTuner for more options on what to show. (Note that some newer DX12 and Vulkan games may not work with many framerate overlay tools that worked fine with earlier DX11 games.)
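Under the hood, these overlays are essentially counting how many frames the game finishes each second. Here's a minimal sketch of that idea (Python; render_frame is a hypothetical stand-in for a game's rendering work, not an API from ShadowPlay, FRAPS, or RivaTuner):

```python
import time

def render_frame():
    """Hypothetical stand-in for a game rendering one frame (~16 ms here)."""
    time.sleep(0.016)

frames = 0
window_start = time.perf_counter()
stop_at = window_start + 5.0                 # run the demo for five seconds
while time.perf_counter() < stop_at:
    render_frame()
    frames += 1
    elapsed = time.perf_counter() - window_start
    if elapsed >= 1.0:                       # report roughly once per second
        print(f"{frames / elapsed:.0f} FPS")
        frames = 0
        window_start = time.perf_counter()
```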

Upscaling and downsampling

Some games offer a 'rendering resolution' setting. This setting lets you keep the display resolution the same (your display's native 1080p or 1440p, for instance) while adjusting the resolution the game world is rendered at (the UI usually stays at native resolution). If the rendering resolution is lower than your display resolution, the image will be upscaled to fit your display—and, as you'd expect, look like garbage, because it's being blown up.

If you render the game at a higher resolution than your display resolution, which is an option in Shadow of Mordor, the image will be downsampled (or 'downscaled') and will look much better at a high cost to performance.
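To put numbers on it, here's a rough sketch (Python, illustrative) of how a render scale changes the pixel count the GPU actually has to draw. I'm assuming the scale applies to both axes and a native 1440p display; some games interpret the slider differently.

```python
# How a rendering-resolution (render scale) setting changes the GPU's workload.
def rendered_pixels(display_width: int, display_height: int, scale: float) -> int:
    """Pixels actually rendered when the scale is applied to both axes."""
    return int(display_width * scale) * int(display_height * scale)

display = (2560, 1440)  # a native 1440p monitor (assumed for this example)

for scale in (0.5, 0.75, 1.0, 1.5, 2.0):
    pixels = rendered_pixels(*display, scale)
    if scale < 1.0:
        note = "upscaled to fit the display"
    elif scale == 1.0:
        note = "native"
    else:
        note = "downsampled to the display"
    print(f"Render scale {scale:.2f}: {pixels:>10,} pixels rendered ({note})")
```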

Performance

Because it determines the number of pixels your GPU needs to render, resolution has the greatest effect on performance. This is why console games that output at 1080p often upscale from a lower rendering resolution—that way, they can handle fancy graphics effects while maintaining a smooth framerate.

I benchmarked Shadow of Mordor at the three resolutions below (all settings on maximum) on the Large Pixel Collider (specs here—two of its four GTX Titans active) to show how much resolution affects performance.

Shadow of Mordor resolution benchmarks (2x Nvidia GTX Titan SLI)

Resolution                        Avg. FPS    Max FPS    Min FPS
1280x720 (1/2 resolution)         102         338        30
2560x1440 (native resolution)     51          189        23
5120x2880 (2x resolution)         16          26         10
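As a rough sanity check, the average framerates track the pixel counts pretty closely: each step up renders four times as many pixels as the last, and average FPS falls by a similar (though not identical) factor. A quick sketch of that comparison (Python, using the numbers from the table above):

```python
# Compare pixel counts to the measured average FPS from the benchmark above.
results = {
    "1280x720 (1/2 resolution)":     ((1280, 720),  102),
    "2560x1440 (native resolution)": ((2560, 1440), 51),
    "5120x2880 (2x resolution)":     ((5120, 2880), 16),
}

native_pixels = 2560 * 1440
for label, ((width, height), avg_fps) in results.items():
    ratio = (width * height) / native_pixels
    print(f"{label}: {ratio:.2f}x the native pixel count, {avg_fps} average FPS")
```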

Vertical sync and screen tearing

When a display's refresh cycle is out of sync with the game's rendering cycle, the screen can refresh during a swap between finished frames. The effect is a ‘break’ called screen tearing, where we're seeing portions of two or more frames at the same time. It is also our number one enemy after low framerate.

One solution to screen tearing is vertical sync (vsync). It's usually an option in the graphics settings, and it prevents the game from swapping to a new frame until the display has completed its refresh cycle, so the swap doesn't occur while the display is in the middle of updating its pixels. Unfortunately, vsync causes its own problems, one being that it contributes to input lag when the game is running at a higher framerate than the display's refresh rate (this AnandTech article explains it in technical terms).

Adaptive Vertical Synchronization

The other big problem with vsync happens when the framerate drops below the refresh rate. If the framerate exceeds the refresh rate, vsync locks it to the refresh rate: 60 FPS on a 60 Hz display. That's fine, but if the framerate drops below the refresh rate, vsync forces it to jump to another synchronized value: 30 FPS, for instance. If the framerate fluctuates above and below the refresh rate often, it causes stuttering. We'd much rather allow the framerate to sit at 59 than punch it down to 30.
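Here's the arithmetic behind that, sketched in Python (assuming a 60 Hz display and simple double-buffered vsync, which is a simplification of how real games behave): whenever a frame isn't ready in time, the display repeats the previous frame for a whole refresh, so the effective framerate snaps to 60, 30, 20, 15, and so on.

```python
import math

def vsync_locked_fps(render_time_ms: float, refresh_hz: int = 60) -> float:
    """Effective framerate under simple double-buffered vsync."""
    refresh_interval = 1000.0 / refresh_hz
    intervals = math.ceil(render_time_ms / refresh_interval)  # whole refreshes per frame
    return refresh_hz / intervals

for unsynced_fps in (75, 60, 59, 45, 31):
    render_time = 1000.0 / unsynced_fps
    print(f"{unsynced_fps} FPS unsynced -> {vsync_locked_fps(render_time):.0f} FPS with vsync")
```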

To solve this, Nvidia's Adaptive Vertical Synchronization disables vsync anytime your framerate dips below the refresh rate. It can be enabled in the Nvidia control panel and I recommend it if you're using vsync.

G-Sync and FreeSync

New technology is starting to solve this big mess. The problem all stems from one thing: displays have a fixed refresh rate. But if the display's refresh rate could change with the framerate, we could eliminate screen tearing as well as the stuttering and input lag problems of vsync. You need a compatible video card and display for this to work, and there are two competing technologies: Nvidia brands its version G-Sync, while AMD's is called FreeSync (originally Project FreeSync).

Initially, Nvidia cards only supported G-Sync monitors, but GeForce cards will now work with some FreeSync monitors. However, only Nvidia GPUs can use variable refresh rates on G-Sync displays. Check our monitor buying guide for advice.
