Free cooling: Hip or hype?

Skyrocketing energy costs and the corporate world’s need to go green have compelled many data centre administrators to begin looking at new ways to cut their cooling and power spending.

Of course, we’ve seen the meteoric rise of system virtualization and cloud computing as a response to this issue, and most of the IT industry has reached a consensus that these technologies are indeed a viable way to consolidate and save on energy costs.

New to the power-saving debate is the notion of free cooling. Some have speculated that cold-weather countries like Canada might offer an advantage in what analysts agree is one of the most expensive aspects of keeping a data centre up and running.

Some countries are even encouraging international businesses to consider setting up IT shops within their borders.


Last year, the Invest in Iceland Agency touted its country as one of the most competitive locations in the world for data centre operations. The country cited cheap and abundant power, low corporate taxes and an overall chilly climate as lures.

But it's not as easy as simply opening up the windows and letting the cold air come in. Taking advantage of "free cooling" might actually take some time, effort and money to achieve.

We talked to a variety of industry players about free cooling, how it fits into increasingly cash-strapped IT organizations and whether the technology is a genuine game-changer.

The case for free cooling

According to recent data compiled by American Power Conversion Corp., Cisco Systems Inc. and Emerson Network Power, half of all data centre energy consumption goes to cooling systems.

Other sources have said that cooling often fits a one-to-one ratio. “For facilities cost, which is mostly cooling and energy costs, a company spends about one dollar on cooling for every one dollar it spends on computing,” said Charles King, principal analyst with Pund-IT Research Inc.

With numbers like that, the benefits of cutting down on energy consumption are unquestionable. But, despite this opportunity, the idea of free cooling has not yet garnered much interest around the world.

“Not a lot of customers have taken an interest in free cooling because they really don’t know about it yet,” said Brian Fry, vice-president of sales and co-founder at RackForce Hosting Inc. of Kelowna, B.C. “There’s a very small percentage of the world that even understands carbon footprints in the first place.”

Fry’s company has teamed up with IBM Corp. to construct a 150,000-square-foot green data centre in Kelowna. The Gigacentre, which opens this spring, will allow companies to rent out space in what RackForce is calling one of the most energy-efficient data centres in the world.

A key part of the new facility, Fry said, is its free cooling mechanism. “The chiller technology has enough intelligence to know that, once it gets below 10 degrees Celsius, it will automatically shut down the evaporative cooling systems — a technology used to evaporate water through cooling machines,” he said. “It just starts to take the cold air from the outside, and filter it into the data centre. We don’t have to use the normal chiller processes.”
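The switchover Fry describes amounts to simple threshold control: below a set outdoor temperature, bypass the evaporative chillers and filter outside air into the facility instead. A minimal sketch of that logic, with hypothetical function names (only the 10-degree threshold comes from the article):

```python
# Sketch of the free-cooling switchover Fry describes: below a
# 10 C outdoor threshold, shut down the evaporative chillers and
# draw filtered outside air instead. Names are hypothetical.

FREE_COOLING_THRESHOLD_C = 10.0

def select_cooling_mode(outdoor_temp_c: float) -> str:
    """Return which cooling system should run for a given outdoor temperature."""
    if outdoor_temp_c < FREE_COOLING_THRESHOLD_C:
        return "free-air"      # filtered outside air, chillers off
    return "evaporative"       # normal chiller process

print(select_cooling_mode(4.0))   # free-air
print(select_cooling_mode(25.0))  # evaporative
```

In practice the controller would also weigh humidity and air quality, which is part of why, as the article notes later, free cooling is more than just opening the windows.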

Fry said that, while a company might spend up to twice as much on the chiller systems to be able to harness the outside air, the savings can be seen fairly quickly and the technology will essentially pay for itself.
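Fry's payback argument can be illustrated with a simple payback-period calculation. All dollar figures below are hypothetical assumptions chosen only to show the arithmetic, not figures from the article:

```python
# Hypothetical payback sketch for Fry's argument: a free-cooling-capable
# chiller system costs up to twice as much as a standard one, but the
# annual energy savings recover the premium over time. All dollar
# figures are illustrative assumptions.
standard_chiller_cost = 200_000       # dollars (assumed)
free_cooling_chiller_cost = 400_000   # up to 2x the standard cost
annual_energy_savings = 50_000        # dollars per year (assumed)

premium = free_cooling_chiller_cost - standard_chiller_cost
payback_years = premium / annual_energy_savings
print(f"Premium pays for itself in {payback_years:.0f} years")
```

Any carbon-credit revenue of the kind Fry anticipates would shorten the payback period further by adding to the annual savings.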

And, if new U.S. President Barack Obama has his way, he added, a world of carbon credits will boost the demand for free cooling.

“You’re going to have a substantially smaller carbon footprint,” Fry said. “In the next few years, when carbon credits become a part of Obama’s world — and hopefully Prime Minister Harper’s as well — the investment will pay off even more.

“If you have to buy a whole bunch of carbon credits, it’s going to cost a fortune to choose a location in Texas, Virginia or Alberta, because you’re going to have up to 100 times the carbon footprint in those places.”

One of the biggest arguments against building a data centre in a remote, cold-weather location is that “you won’t have the network up in those places,” Fry said. “But that’s really an old wives’ tale now,” he added. “You’ve got fibre networks that run just about everywhere. As long as you deal with redundant networks coming from two directions, and you put the right electronics on each end of that fibre, you can do just about anything.”

King agreed, saying that the availability of high throughput networking is so common now that proximity to company headquarters is not really an issue.

Darin Stahl, lead analyst with Info-Tech Research Group, added that, while many in the industry believe that the data centre “air-handling” market is now stagnant and has reached its peak, the truth is that huge efficiencies have been gained in recent years. “The average life span of a server room is between 10 and 15 years,” Stahl said. “During that time, equipment and technology will change two to three times.”

He said an air conditioning unit in a data centre 15 years ago would have a maximum heat removal capacity of seven watts per square foot. “But, today, it’s about 50 watts per square foot, and more high-end precision and balanced solutions can achieve up to 250 watts.”
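Stahl's per-square-foot figures translate into large differences at room scale. A quick back-of-the-envelope comparison, assuming an illustrative 1,000-square-foot server room (the room size is an assumption, not from the article):

```python
# Back-of-the-envelope comparison using Stahl's heat-removal figures
# (watts per square foot). The 1,000 sq ft room size is an
# illustrative assumption.
ROOM_SQ_FT = 1_000

capacities_w_per_sq_ft = {
    "15 years ago": 7,
    "typical today": 50,
    "high-end precision today": 250,
}

for era, w_per_sq_ft in capacities_w_per_sq_ft.items():
    total_kw = w_per_sq_ft * ROOM_SQ_FT / 1_000  # convert W to kW
    print(f"{era}: {total_kw:.0f} kW of heat removal")
```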

Free cooling is just another redundant system that Canadian-based data centres might be able to take advantage of during a substantial portion of the year, he added.

Why you don’t need free cool to be cool

Companies will, however, have to carefully examine whether the technology will save more than it will cost.

“You’ll probably save some energy costs, but what you don’t really do is achieve a full offset,” Stahl said. “Because now you’ve got to introduce a whole bunch of other operating and power mechanisms before bringing the air into the data centre.”

In the case of data centres located around the Pacific west coast, Stahl said that unfiltered sea air can bring unacceptable levels of static and humidity into the data centre.

“The basic premise that you’re just going to open the windows and take advantage of cooling is (foolish),” he added. “We also don’t know the total cost of ownership and the kind of maintenance issues you’re going to see, because free cooling hasn’t been around that long.”

John Smith, vice-president at the Minneapolis, Minn.-based engineering consulting firm Michaud Cooley Erickson, said that companies with data centres in cold-weather climates would be wise to
