
Choose the right graphics card: 2012 edition

Loyd Case | Aug. 13, 2012
Modern graphics cards are intimidating, hulking beasts in a world of increasingly tiny PC components. Most of them are double-wide, occupying two expansion-slot spaces, even though they use only a single physical slot. Many require two power connectors and beefier-than-average power supplies. Their primary audience appears to be serious PC gamers, who use an arcane jargon of their own: frame rates, VSync, antialiasing.

Roleplaying games vary in their graphics requirements, but most are playable on midrange graphics cards priced between $200 and $300. One exception to this rule is The Witcher 2, an extremely demanding game in part because it tries to do too much with DirectX 9, an older graphics interface. Still, even most action RPGs like Diablo III and hybrid shooter RPGs like Mass Effect 3 run acceptably on modestly priced hardware.

Massively multiplayer online games--particularly free-to-play titles--usually avoid taxing GPUs beyond what most current systems can handle, because they're trying to attract as large an audience as possible.

Indie games of all stripes tend to trail the graphics leading edge, and in many instances you can get by with a lower-cost card when playing them.

Finally, if you mostly play older games, your GPU needs are probably relatively modest.

What Other Applications Do You Run?

GPU-accelerated apps are becoming more common. The earliest use of GPU acceleration in consumer applications was video transcoding. Applications such as CyberLink's MediaEspresso have added support for additional graphics hardware and application programming interfaces (APIs) over time.

Photo- and video-editing applications followed. The latest version of Photoshop, CS6, uses OpenGL for most of its rendering and GPU compute to accelerate the filters in its blur gallery. Musemage, developed in China, is a completely GPU-accelerated photo-editing application.

Windows 8 will use GPU acceleration for all 2D rendering, and Microsoft Office 2010 already supports graphics acceleration for Excel and PowerPoint charts. Even games use the GPU for more than just graphics--to accelerate physics, fluid dynamics, and special-effects calculations.

Web browsers are taking advantage of graphics cards, too. Google Chrome and Firefox use WebGL for 3D acceleration, and Chrome, Internet Explorer 9, and Firefox all use the GPU to accelerate 2D page rendering.

If you use your GPU for nothing more than accelerating Windows and Web browsers, modern integrated graphics such as Intel's HD 4000 (included with all mobile Ivy Bridge processors) and the Radeon GPU integrated into all of AMD's A-series processors are good enough. But if you do more-demanding work, you may want a discrete graphics card. Even then, for most normal desktop use you don't need to spend a bundle; a card priced between $150 and $250 is definitely good enough.

The biggest problem with GPU-accelerated apps is that the concept is still fairly young, and therefore fragmented. Some applications use Nvidia's proprietary CUDA framework exclusively. Others use only Microsoft's DirectCompute programming interface, which supports all current hardware, but only on Windows machines. A few applications now use OpenCL, an open interface that works with Windows, Mac OS X, and Linux--though not all cards have current OpenCL drivers. So you have to confirm that your graphics card and your application will work with each other. Of course, the CPU can always serve as a fallback, in case GPU acceleration doesn't work for your particular application.
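To see what that CPU fallback looks like in practice, here is a minimal Python sketch of the pattern many GPU-accelerated applications follow: try a compute framework at runtime (pyopencl is used here purely as an illustration; the article's applications may use CUDA or DirectCompute instead), and quietly fall back to ordinary CPU code if no usable driver or device is found.

```python
# Sketch: probe for a GPU compute framework at runtime, fall back to the CPU.
# pyopencl is an assumed optional dependency; the CPU path is plain Python.

def scale_vector(values, factor):
    """Multiply each element by factor, on the GPU if OpenCL is usable."""
    try:
        import numpy as np
        import pyopencl as cl
        import pyopencl.array as cl_array

        # Fails with an exception if no OpenCL platform/driver is present.
        ctx = cl.create_some_context(interactive=False)
        queue = cl.CommandQueue(ctx)
        data = cl_array.to_device(queue, np.array(values, dtype=np.float32))
        return (data * factor).get().tolist()
    except Exception:
        # No usable OpenCL stack: do the same work on the CPU instead.
        return [v * factor for v in values]

print(scale_vector([1.0, 2.0, 3.0], 2.0))
```

Either path returns the same answer; the caller never needs to know whether the GPU was used, which is exactly why the fragmented API landscape is tolerable for end users.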
