Data centre designers share secrets of going green

Rising utility costs from running denser, hotter server and storage gear are forcing IT managers to redesign data centres to be more energy-efficient.

Take the case of Dan Wilson, IT operations manager for Osram Sylvania in Danvers, Mass. Getting an updated electrical bill prompted his data centre redesign.

“We were being charged a pro-rated amount that had been determined several years before that no one had ever updated internally when energy costs rose,” Wilson says. “When accounting did update it, we said ‘Oh wait, they are hitting us with a real bill now.’ The [amount] came as a shock to us.”

Wilson manages two data centres for the lighting manufacturer — one in Danvers and the other located above a loading dock in a facility in Manchester, N.H. When trucks enter or leave the loading dock, much of the benefit of the air conditioning in the data centre is lost as the doors open and close. Wilson’s 2,000-square-foot data centre suffers not only when the overhead doors open but also from the fact that the sun beats down on an exposed wall of the data centre and on the roof. Also, when Osram Sylvania designed its data centre, it put racks of servers parallel to the air conditioning units, thus cutting their effectiveness. “We set up the data centre so it looked good,” he says.

In an effort to make its data centres more energy-efficient, Osram Sylvania hired Degree Controls to conduct a thermal study, which pinpointed hot spots where excess energy consumption occurred. Wilson installed sensor-based, fan-assisted AdaptivCool floor tiles that monitor the data centre for hot spots, dynamically manage the flow of cooling to racks and exhaust hot air back to the computer room air conditioner (CRAC) intake.

“We have four sensors spread down the row of servers,” he says. “When those sensors determine that we’ve hit a certain temperature collectively, it will kick in the fan to push more cold air up at the far end of the row.”
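The control logic Wilson describes can be sketched in a few lines. This is a hypothetical illustration, not AdaptivCool’s actual firmware; the averaging rule and the trigger temperature are assumptions.

```python
# Hypothetical sketch of the sensor-driven fan control described above.
# The averaging rule and the 77°F trigger are assumptions, not
# AdaptivCool's actual logic.

def fan_should_run(sensor_temps_f, threshold_f=77.0):
    """Return True when the sensors' collective (average) reading
    meets or exceeds the trigger temperature."""
    avg = sum(sensor_temps_f) / len(sensor_temps_f)
    return avg >= threshold_f

# Four sensors spread down the row of servers:
readings = [74.0, 76.5, 79.0, 81.5]
if fan_should_run(readings):
    print("fan on: pushing cold air to the far end of the row")
```

The point of acting on a collective reading rather than any single sensor is to avoid cycling the fans on a transient hot spot at one rack.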

Wilson is also reorganizing his primary data centre, eliminating half of the nine racks of servers by virtualizing IBM AIX systems. He plans to realign the remaining servers into hot and cold aisles.

The costs of getting it cool

The proper and efficient cooling of data centres has gotten a lot of attention lately from data centre designers and other IT professionals. Research firm Gartner predicts that by the end of 2008, nearly half of data centres worldwide will not have sufficient power and cooling to support new high-density server and storage gear. Gartner estimated that in 2005 alone, companies spent US$6 billion powering data centres in the United States. By 2011, more than a third of data centre budgets will be allocated to environmental costs, the market watcher estimates.

Data centres are going green in large part because of rapidly escalating energy costs, but also out of a desire to be good “corporate citizens.” At 365 Main, a data centre builder in San Francisco, designers have taken building green to heart.

In Newark, Calif., 365 Main has built from the ground up the first LEED (Leadership in Energy and Environmental Design) certified data centre in the United States. The facility, which tops out at over 136,000 square feet of floor space, features recycled and locally obtained building materials, computer room air handlers and CRAC units that consume 30 per cent less energy than traditional models, make-up air handlers that draw on outside ambient air, and energy-efficient lighting and water-efficient landscaping.

“The air handler units feature variable-speed motors that, by ramping up and down, use 30 per cent to 60 per cent less energy and still keep the facility cool,” says Miles Kelly, vice president of corporate strategy.

Kelly is also using a free source of energy — the ambient outside air — to cool the Newark data centre.

“We are also using a cooling technology whereby, when the outside temperature is less than or equal to 60 degrees, we will be using outside air to cool the facility,” says Kelly. “In the Newark data centre, for example, we expect to be able to use ambient air at least a third of the year if you consider these facilities are running 24 hours a day, seven days a week. The result is we will be able to reduce our yearly operating cost by $500,000.”
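Kelly’s figures can be turned into a rough back-of-the-envelope calculation. The 60-degree cutoff, the one-third duty estimate and the $500,000 savings come from his remarks; the per-hour figure below is derived from them and is only an approximation.

```python
# Back-of-the-envelope sketch of the free-cooling figures quoted above.
# The 60°F cutoff, one-third duty cycle and $500,000 annual savings are
# from the article; the hourly figure is derived, not reported.

ECONOMIZER_CUTOFF_F = 60.0              # outside air usable at or below this

def use_outside_air(outside_temp_f):
    return outside_temp_f <= ECONOMIZER_CUTOFF_F

hours_per_year = 24 * 365               # facility runs 24/7
economizer_hours = hours_per_year / 3   # "at least a third of the year"
annual_savings = 500_000                # dollars, per the article

print(round(economizer_hours))                          # 2920 hours of free cooling
print(round(annual_savings / economizer_hours, 2))      # ~$171.23 saved per hour
```

In other words, roughly 2,900 hours a year of mechanical cooling are displaced by outside air, worth about $170 an hour in avoided operating cost.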

Despite such cost savings, there is a premium to building energy-efficient data centres.

“It’s not a zero-cost leap to go [green],” says Kelly. “We expect to add 10 per cent to the build-out cost. So for a $100 million facility, that’s a $10 million increase in construction costs. That’s a 20-year payback if you were simply to look at the $500,000 savings per year.”
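Kelly’s payback arithmetic, made explicit. The figures are from the article; note that this simple-payback calculation ignores energy-price growth and utility incentives, both of which would shorten the real payback period.

```python
# The simple-payback arithmetic Kelly describes. All inputs are from
# the article; the formula deliberately ignores energy-price growth
# and utility incentives.

build_cost = 100_000_000        # baseline facility cost, dollars
green_premium_pct = 10          # ~10 per cent added to build-out cost
annual_savings = 500_000        # operating savings per year, dollars

premium = build_cost * green_premium_pct // 100   # added construction cost
simple_payback_years = premium / annual_savings

print(premium)                  # 10000000
print(simple_payback_years)     # 20.0
```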

But there are bigger issues at play than just the cost of being green. In Kelly’s data centre, he was able to reduce the facility’s overall environmental footprint by using recycled or locally sourced materials.

“You get points with the U.S. Green Building Council and the LEED certification to feature a higher percentage of recycled materials,” says Kelly. And utilities such as Southern California Edison and Pacific Gas and Electric are offering incentives and programs for those building green data centres.

“We want to participate in these programs but not diminish our services to our clients,” says Jean Paul Balajadia, vice president of operations and founder of 365 Main. “Some [programs] are set up to say ‘if we give you 24-hour notice, we need you to turn off your power’ — that’s just not acceptable in a data centre environment. There are other programs open to us, though, such as the Critical Peak Pricing Program, which saved us $70,000 over the six summer months last year.”

With PG&E’s Critical Peak Pricing Program, customers are notified a day before a critical power-peak event occurs. Usage during summer peak hours is discounted on days when no events are called.