
Hands-on with AMD's FreeSync: The technology that could kill Nvidia's G-Sync

Gordon Mah Ung | March 20, 2015
If there's one thing the tech market doesn't need, it's another standards catfight. But you survived FireWire vs. USB, HD-DVD vs. Blu-ray, and RDRAM vs. DDR, so get ready for the battle between Nvidia's G-Sync and AMD's FreeSync to kick into high gear.

That war formally kicked off this morning, with AMD announcing that no fewer than four monitors supporting its sync technology were finally available for sale in the U.S., with another seven expected soon. By the end of the year, the company says, expect 20 monitors supporting FreeSync to be available.

FreeSync and G-Sync, if you didn't know, are technologies from the two leading graphics companies that let a monitor synchronize its refresh with the frames the graphics card actually outputs. Both promise to eliminate tearing and stuttering in games, but the two are incompatible and take different routes to get there.
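To make that concrete, here's a minimal timing sketch, our illustration rather than code from either vendor, using an assumed 60 Hz panel and a GPU rendering at roughly 45 fps. It shows how a fixed-refresh monitor delivers frames unevenly when the GPU runs off the refresh beat, while an adaptive-sync monitor simply refreshes the moment each frame is ready:

```python
import math

# Minimal sketch of fixed-refresh vs. adaptive-sync frame delivery.
# Assumed numbers for illustration only: a 60 Hz panel and a GPU
# rendering at ~45 fps (22.2 ms per frame).

GPU_FRAME_MS = 1000 / 45   # ~22.2 ms to render each frame
REFRESH_MS = 1000 / 60     # 16.7 ms between fixed 60 Hz refreshes

def fixed_refresh(n):
    """Each finished frame waits for the next 60 Hz refresh tick."""
    shown = []
    for i in range(1, n + 1):
        ready = i * GPU_FRAME_MS
        ticks = math.ceil(ready / REFRESH_MS)  # next refresh boundary
        shown.append(ticks * REFRESH_MS)
    return shown

def adaptive_sync(n):
    """The monitor refreshes the instant each frame is ready."""
    return [i * GPU_FRAME_MS for i in range(1, n + 1)]

for label, shown in (("fixed 60 Hz", fixed_refresh(6)),
                     ("adaptive sync", adaptive_sync(6))):
    gaps = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
    print(f"{label}: frame-to-frame gaps (ms) = {gaps}")
```

Run it and the fixed-refresh gaps bounce between 16.7 and 33.3 ms, which the eye reads as stutter, while the adaptive-sync gaps hold steady at 22.2 ms.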

Nvidia's G-Sync was out of the chute first, having been announced way back in October of 2013. By all accounts it was impressive. The problem was actually getting G-Sync monitors: the first ones were based on existing stocks of high-refresh-rate 3D monitors, modified by OEMs, and it took months for the press, let alone the public, to touch the panels. G-Sync puts actual hardware inside the monitor that communicates with Nvidia's modern GPUs. That hardware, naturally, is sold only by Nvidia.

Within months of Nvidia announcing G-Sync, AMD introduced FreeSync. Instead of having panel makers add modules to their monitors, AMD's version would rely on an idea already being kicked around for laptops: varying the refresh rate to save power. And unlike Nvidia's tack of getting hardware out as soon as possible by using proprietary components, AMD proposed putting variable refresh support directly into the display specs, so future monitors would support it. AMD was successful, and VESA, the group that blesses monitor standards, baked Adaptive-Sync into DisplayPort 1.2a last April.

G-Sync actually works

My experience with Nvidia's G-Sync, aside from trade shows and demos, has been limited to an Acer 4K G-Sync panel. Outside the control of any company, and without someone peering over my shoulder while I mucked with it, I'd have to say G-Sync is exceedingly easy to set up. You just need a compatible GeForce GPU (pretty much any of Nvidia's modern GeForce 600-, 700-, and 900-series or Titan cards) and a driver with G-Sync support. Switch it on in the Nvidia control panel and you're done.

On the Acer 4K panel I drove, the impact is fairly impressive. Not only are stutter and tearing reduced to almost nothing, you can actually play games at lower frame rates you'd normally turn your nose up at. That was one of the early arguments for G-Sync, too: even though it added cost to the monitor, you could ease back a bit on your GPU budget to fund the purchase, since a frame rate of 45 frames per second would be acceptable with G-Sync.
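Here's the quick arithmetic behind that 45 fps argument (back-of-envelope numbers for illustration, not measured results):

```python
# Back-of-envelope arithmetic for the 45 fps argument (illustrative
# numbers, not measured results).
fps = 45
frame_ms = 1000 / fps   # ~22.2 ms to render each frame
print(f"G-Sync: every frame held on screen for {frame_ms:.1f} ms")

# On a fixed 60 Hz panel with vsync, hold times quantize to 16.7 ms
# refresh ticks: roughly two short frames then one long one, repeating.
pattern = [16.7, 16.7, 33.3]
print(f"fixed 60 Hz vsync: hold times cycle {pattern} ms "
      f"(average {sum(pattern) / len(pattern):.1f} ms, but uneven)")
```

The average is the same 22.2 ms either way; it's the evenness that makes 45 fps palatable on a G-Sync panel.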

 
