eBay Cools Desert Data Center With Hot Water


The typical data center is air conditioned. As massive electrical units work overtime to cool servers and other hardware, the temperature hovers somewhere between 65 and 80 degrees Fahrenheit. But in an effort to save power and money -- and ultimately the environment -- eBay is running at least part of its new Phoenix data center at temperatures as high as 115 degrees. The temperatures are so high, eBay can cool the facility with hot water.

According to eBay, the water in its outdoor tower tank can reach 87 degrees on the hottest of summer days, and even this is cool enough to operate the heat exchangers that keep the facility's servers from reaching the extreme temperatures where they might malfunction.

"If we can solve this in a desert," says Dean Nelson, eBay's senior director of global foundational services and the man who oversaw the project, "we can solve this almost anywhere."

Nicknamed "Project Mercury," eBay's data center is part of a sweeping movement to reduce power consumption and save costs in the massive computing facilities that run the web's big name services. Google has long advocated hotter data centers, insisting that servers run just fine at temperatures well above 80 degrees, and countless other operations have followed suit.

Many of these outfits have also built data centers in mild climates so that servers can be efficiently cooled with the outside air, and Google has even gone so far as to cool its Finland data center with the frigid water from the Baltic Sea. But eBay's facility is different. It has achieved seemingly unprecedented efficiencies in a rather warm climate.

eBay didn't put its facility in Phoenix just to prove a point. It put the facility there because it needed a data center in the area. Then it found a way to run it efficiently. "Wherever we have our data centers," Nelson says, "we're going to optimize our infrastructure."

The facility includes traditional server rooms that sit atop a raised floor so that cool air can be pushed into the machines, but eBay has also placed data center "containers" on the roof of the building. Pioneered by Google, these contraptions resemble traditional shipping containers. Pre-packed with servers and other gear, they let you piece together data centers in much the same way you'd stack building blocks.

According to an official report from The Green Grid -- a non-profit consortium of tech companies that works to optimize data center design -- the entire facility has a power usage effectiveness (PUE) of 1.35 when there's a 30 to 35 percent load on the servers. And the best-case scenario is a PUE of 1.26.

PUE measures the efficiency of a data center: it's the ratio of the total power used by the facility to the power delivered to its computing equipment, and the closer to 1, the better. A PUE of 1.35 is good but not great. Google says that in 2011, the average PUE for all of its data centers was 1.14. But eBay -- a member of The Green Grid -- has achieved far better results with its rooftop containers.
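The ratio described above is simple to compute. As a minimal sketch -- using hypothetical load figures, since the report's raw power numbers aren't given here -- the math works out like this:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power.

    A PUE of 1.0 would mean every watt entering the building
    reaches the servers; everything above 1.0 is overhead
    (cooling, power distribution losses, lighting, and so on).
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical example: 1,000 kW of IT gear plus 350 kW of overhead
# yields the 1.35 whole-facility figure; only 18 kW of overhead on
# the same IT load would yield the containers' 1.018.
print(round(pue(1350.0, 1000.0), 2))   # 1.35
print(round(pue(1018.0, 1000.0), 3))   # 1.018
```

The kilowatt values here are illustrative, not from the Green Grid report; only the resulting ratios match the figures in the article.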

If you consider the containers on their own, their PUE was a mere 1.018 when they were in operation this past January. And even when the outside temperature reached 115 degrees Fahrenheit in the middle of August, the report says, the 20-rack rooftop containers still had a PUE of 1.046. Up to 12 containers can sit outdoors on the roof, in direct sunlight.

There are caveats. These "partial" PUE ratings don't account for some of the upstream transformers and power sources required to operate the entire Phoenix data center. But as Data Center Knowledge points out, a PUE of 1.018 appears to be unprecedented.

John Pflueger -- the principal environmental strategist at Dell and the company's representative to The Green Grid -- says "Project Mercury" is the result of big-name companies sharing their data center strategies with the rest of the world. Other Green Grid members include Facebook, Microsoft and HP.

"This resets the bar for what's possible in the data center," he says. "The companies whose data centers are like factories are really the ones [incentivized] to squeeze efficiencies out of their data centers."

But eBay's Nelson says that cooperation isn't the only way to improve data center design. You need a healthy amount of competition as well. At its Phoenix site, eBay has actually pitted Dell and HP against each other to see who can provide the better hot-water cooling system. Dell has implemented a tower cooling system, while HP is using an adiabatic system, which uses sharp pressure alterations to change water temperature. Nelson declines to say who's winning, but the point is that both are determined to win.