Like a wisecracking sidekick who winds up stealing the movie from a too-bland lead actor, graphics processing units are edging more general-purpose central processing units out of the limelight.
"There's this conventional wisdom [that the] GPU equals games, and a fast PC is a fast CPU," says Rob Csonger, Nvidia's vice president of corporate marketing. "The truth today is the GPU is accelerating everything because everything is rendered now."
Over the past several years, graphics processing units have evolved from highly specialized components coveted by Mountain Dew-swilling Unreal Tournament devotees into high-performance computing engines used by academic researchers. The latest transformation has turned the GPU into a fully programmable, open-architecture chip, in some cases just as flexible as – and packing far more parallel-processing power than – today's general-purpose central processing units.
The evolution of the GPU has prompted changes throughout the computer industry, from PC manufacturers who are modifying systems to better take advantage of GPUs, to software makers who are adding features designed to exploit the now-ubiquitous graphics chips.
Recent demos by Adobe showing how Photoshop and Flash might make use of GPU acceleration are merely the latest in a parade of software and hardware vendors copping to the power of the GPU.
While much of the GPU market these days is still anchored to videogames, graphics rendering has become increasingly important to a wide range of ordinary computing tasks. On the mobile front, the iPhone and iPod Touch, both of which use a version of Imagination Technologies' PowerVR MBX mobile graphics processor core, have cemented the notion that whizzy graphics can dramatically improve the user experience – especially on touchscreen devices. Other handset manufacturers, such as Nokia and Sony Ericsson, have also started incorporating robust 3-D graphics acceleration chips into their high-end phones. And modern operating systems, like Microsoft's Vista and Apple's Leopard, can barely open a text file without making heavy use of the GPU, thanks to their 3-D interfaces and slick visual effects.
What's more, the GPU's parallel architecture makes it well suited to a variety of modern computing tasks.
"When you look at the GPU what you're basically looking at is a highly parallel processing engine," explains Mercury Research analyst Dean McCarron. While today's top-end CPUs boast four cores, GPUs have anywhere between 80 and 128 cores. That makes them particularly adept at doing tasks that require a lot of simultaneous number crunching, such as 2-D and 3-D graphics, but also cryptography, scientific modeling, transcoding HD video streams and even running financial market simulations.
Many high-end GPUs also include a dedicated video unit for faster encoding and decoding of video data – a capability that companies like Elemental Technologies are already exploiting with new GPU-accelerated video-processing software.
"Ultimately, everything you now see on your computer now touches the GPU in some way or another," notes McCarron.
The GPU's increasing clout is also starting to have a profound effect on how manufacturers and chipmakers build computers.
For instance, Gateway recently introduced a budget gaming laptop, the P-6831 FX, that uses a mid-range GPU (the Nvidia GeForce 8800M) to compensate for a relatively anemic CPU (a 1.66-GHz Intel Core 2 Duo) – a strategy that delivers respectable gaming performance at a $1,200 price tag. The laptop has been more or less sold out at Best Buy since its introduction early this year.
On the software side, consumer-oriented companies are also increasingly relying on the GPU.
Adobe recently announced that the forthcoming version of its Flash Player would start using GPU acceleration for 3-D effects, video rendering and large bitmap images of up to 8,191 pixels per side.
"When you boil it down, the GPU is really just a type of CPU that is used for calculating floating point operations," says Tom Barclay, senior product marketing manager for Adobe's Flash Player. "With that, you get high bandwidth, you get additional memory, and you get what's basically a really versatile processor."
Cooliris is another company that has figured out how to harness the GPU, in this case for a better web-browsing experience. Working with Nvidia, the company recently debuted an application called PicLens.
Instead of relying on the 2-D interface you get when hunting down pictures and videos on YouTube, Flickr or Google, PicLens renders those results as a glowing wall of images that you can scroll through and zoom in and out of effortlessly.
"People get caught up in the 3-D element of [Piclens] – the flashy element – but I think there is also a fundamental navigation problem we're solving," says co-founder Josh Schwarzapel. That is: How do we make a dauntingly large volume of content easily searchable?
As more and more of our personal content finds its way into digital form, graphics-intensive interfaces to that data, like PicLens and Delicious Library, will look less like visual frippery and more like essential navigation tools.
In the end, the display may not quite be the computer, as Nvidia CEO Jen-Hsun Huang declared in a 2002 Wired magazine profile. But in today's computing environment, the pixel is definitely king. And that can only mean good things for the GPU's future.