
G-Sync vs. FreeSync FAQ: How variable refresh rate displays make PC games super-smooth

Jason Evangelho | Aug. 31, 2015
Imagine games without stuttering or tearing. Games without ghosting. Two rival technologies both promise that—learn more about them here.

Variable refresh rate monitor.

That jumble of words isn’t rocketing to the top of any “sexiest tech phrases” list anytime soon. Nevertheless, it’s quite literally game-changing technology that’s just beginning to seep into mainstream awareness and adoption. You may know this technology by another, slightly more memorable pair of names from two companies driving your PC gaming experience: Nvidia’s G-Sync and AMD’s FreeSync.

Despite being relatively new, dozens of G-Sync and FreeSync monitors are available to satisfy a broad range of cravings. You'll find models priced from $199 to north of $1000, encompassing 1080p, 4K, and even gorgeous curved 1440p UltraWide displays.

So what’s the big deal? Do you need G-Sync or FreeSync in your life? Does it cost more? Are there any deal-breaking drawbacks? Is this tech restricted to desktop use? Is your current video card compatible? Sit back, grab a beverage, and let’s tackle these pressing questions.

What’s so special about G-Sync and FreeSync?

Ever since we began manipulating onscreen gaming graphics with a keyboard and mouse, the two crucial pieces of hardware in that equation have been butting heads. Your video card is impatient to push image frames to your monitor the moment they're finished, but a monitor with a fixed refresh rate of, say, 60Hz redraws the screen on its own rigid schedule, roughly once every 16.7 milliseconds. When a fresh frame arrives partway through a redraw, you only see part of what's happening: a portion of the old frame on one part of the screen, and a portion of the new frame on the rest. It looks as if the picture were trying to split itself in two and take off in different directions, and it only worsens the more your game's frame rate fluctuates.

screen tearing example

Another name for this is screen tearing, an ugly artifact that’s become something PC gamers grudgingly accept as reality. But it's more than an annoyance — it's the difference between in-game life and death. Say you’re playing Battlefield 4 and a sniper camping on some mountain peak takes aim at you. The glint of his scope against the sunlight would give him away, except you didn’t see it because it took place on that fragment of a frame your monitor rejected. Sure, it’s an extreme case, but it punctuates the very real problem.
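If you're curious why the tear line seems to dance around the screen, the little Python sketch below is a simplified, back-of-the-envelope illustration (ours, not anything lifted from Nvidia or AMD). It assumes a 60Hz monitor that scans out 1,080 rows from top to bottom every 16.7 milliseconds or so, and a GPU that swaps in a new frame the instant it finishes rendering; whatever row the scanout happens to be on at that instant is where the image splits.

```python
# Simplified illustration (not from the article): where a tear line lands on a
# fixed-refresh display. Assumes a 60 Hz monitor scanning out 1080 rows
# top-to-bottom each refresh, and a GPU that swaps in a new frame at an
# arbitrary moment while the scanout is still in progress.

REFRESH_HZ = 60
ROWS = 1080
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ   # ~16.7 ms per redraw

def tear_row(swap_time_ms: float) -> int:
    """Row where the tear appears when the GPU swaps buffers swap_time_ms after
    scanout began: rows above it come from the old frame, rows below from the new one."""
    offset_in_refresh = swap_time_ms % REFRESH_INTERVAL_MS
    return int(offset_in_refresh / REFRESH_INTERVAL_MS * ROWS)

# A GPU rendering at an uneven ~75 fps keeps finishing frames mid-redraw,
# so nearly every refresh shows pieces of two different frames.
for swap_ms in (13.3, 26.6, 41.0, 53.5):
    print(f"buffer swap at {swap_ms:5.1f} ms -> tear near row {tear_row(swap_ms)}")
```

Because the GPU's frame times rarely line up with the monitor's schedule, each swap lands at a different point in the redraw, which is why the tear wanders up and down the screen instead of staying put.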

The existing workaround is the V-Sync setting on your graphics card. Sadly, in solving one problem it introduces another: now your monitor is calling the shots. When your GPU has a frame ready to deliver, the monitor says “wait a few more milliseconds! This silly gamer doesn’t want screen tearing.” The GPU holds the finished frame until the monitor’s next scheduled refresh, so the previous frame lingers on screen longer than it should. With V-Sync on, this manifests itself as “stutter,” a hitch in the animation that can be a little jarring and make the game you’re playing feel sluggish.
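To put some rough numbers to that, here's another simplified sketch of ours: with V-Sync on, buffer swaps are only allowed at the monitor's refresh boundaries, so a frame effectively stays on screen for its render time rounded up to a whole number of 16.7-millisecond intervals on a 60Hz panel.

```python
# Simplified illustration (not from the article): why V-Sync turns a slightly
# slow frame into stutter. Assumes a 60 Hz display, so buffer swaps only happen
# at ~16.7 ms refresh boundaries; a frame that takes even a bit longer must
# wait for the next refresh, and the previous frame is shown again meanwhile.
import math

REFRESH_INTERVAL_MS = 1000 / 60   # ~16.7 ms between allowed buffer swaps

def displayed_interval_ms(render_time_ms: float) -> float:
    """Time a frame stays on screen with V-Sync on: the render time rounded
    up to the next whole refresh interval, because the swap has to wait for
    the monitor's next scheduled redraw."""
    refreshes = max(1, math.ceil(render_time_ms / REFRESH_INTERVAL_MS))
    return refreshes * REFRESH_INTERVAL_MS

for render_ms in (14.0, 18.0, 25.0, 34.0):
    shown = displayed_interval_ms(render_ms)
    print(f"frame rendered in {render_ms:4.1f} ms -> on screen for {shown:4.1f} ms "
          f"(~{1000 / shown:.0f} fps effective)")
```

Notice what happens when the GPU misses the 16.7ms budget by even a millisecond or two: the frame gets held until the next refresh and the effective rate drops to 30fps. That back-and-forth between 16.7ms and 33.3ms frames is the stutter you feel, and the extra waiting is part of why V-Sync can make a game feel sluggish.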

 

