Creative Disruption

THE WIRED ECONOMY

Productivity rates usually fall in a recession. Not this time.

In 1987, Nobel Prize-winning economist Robert Solow famously joked that he could see computers everywhere but in the productivity statistics. Despite decades of massive technology investments, corporate America's productivity rate - the output of the average worker - was growing no faster than before. Because productivity is the main ingredient in overall economic growth, this was more than just a statistical puzzle; it was also a disappointment on a grand scale.

Then came the mid-'90s, and suddenly all the pieces fell together. As networked PCs arrived in the workplace, combining all those bits of technology, productivity shot up, propelling the stock market before it and giving rise to the notion of a supercharged new economy.

Now, in the wake of the stock market crash, the skeptics are again holding sway. Technology's effect on the economy is easy to exaggerate, they argue; most of the gains to date are an artifact of the bubble and the free-spending ways it encouraged. The new economy is not so much dead as it never existed; the ruins of the Nasdaq are the price of the fantasy. This is a popular theory these days. Rather awkwardly for the skeptics, however, the latest evidence points the other way.

In February, the Labor Department released new productivity numbers for the final three months of 2001. Normally, productivity slows in a downturn as companies struggle with excess inventory and staff. This time, however, it seems to be surging: The 3.5 percent annual growth rate for the period is well above the 2.6 percent average rate for the late '90s and puts the 1.7 percent rate of the preceding three decades to shame. It is also the single greatest factor behind the economy's miraculous ability to show growth in the fourth quarter.

Since the recession began last March, the productivity rate has consistently defied those who thought it would plunge. Apparently, companies really have changed how they operate: They are now far more nimble, able to cut workers and production more quickly than ever before. Seen this way, the flood of layoffs over the past year is evidence for a new economy - not against it.

Add a few other indicators - such as the surprising prospect of the mildest recession since World War II following the biggest boom - and one might argue that the new economy is alive and well, despite the downturn. Indeed, in February the White House Council of Economic Advisers released a report concluding just that.

Then why are so many economists unconvinced? Because the numbers only hint at the underlying changes, providing a cloudy window on a new world - and they are unlikely to improve much soon. There is nothing controversial about the notion that technology increases productivity - think of an accountant switching from an adding machine to Excel. Knowing how to measure the rise, however, can be vexing.

When technology allows fewer people to do the same work (factory automation) or the same number of people to do more work (that accountant), it shows up on a spreadsheet. But what about an Intel factory employee producing consistent quantities of ever more powerful chips? Increased power equals increased productivity - but such qualitative increases don't show up in quantitative studies. Measuring more work is easy, but measuring better, higher-quality work is hard.

In the absence of satisfying statistics, the debate over the very existence of a new economy has hinged on anecdote and theory. But history can help, too, and it suggests that more time will bring stronger evidence. The one undeniable fact about disruptive technologies is that they're not easily substituted for older technologies.

The introduction of electricity, for instance, meant replacing one big steam engine with many small motors, turning a centralized factory into a distributed one - or, often, simply building a new factory from scratch. As a result, it took about 50 years for electricity to be adopted by half of all factories and for its effect to show up in the productivity tables. Earlier, steam power had launched a similar - and similarly protracted - revolution. Both steam and electricity created new economies of their own, eventually.

Seen from this perspective, the productivity gains of the late 1990s aren't the whole story: The upturn is likely explained by the PC's omnipresence finally showing up in the numbers, or - as the skeptics argue - it could be construed as a statistical artifact of the tech boom. The Internet has yet to have its full impact.

But the wait could be a short one. Disruptive technologies do best when introduced in the midst of great change; there is less to unlearn. This is the case for digital networks: The economy was already shifting from manufacturing to services, and the first wave of information technology had already arrived - the groundwork was done. A new model from the National Bureau of Economic Research suggests that, in such conditions, the halfway adoption mark can be reached in a little more than a decade.

And the well-measured productivity gains that ensue? Sadly, government statisticians are not nimble creatures like factory workers; the forecast calls for hazy numbers for a long time to come.