How ‘hot’ data centres can cook your goose


When Roger Hardy, IT director for the city of Jeffersonville, Ind., gets an alert from an automated monitoring system that his data centre air conditioning is failing, he has 20 minutes to fix the system before the computer room’s temperature reaches what he describes as its “death point.”

If the thermometer inside the data centre hits 33 degrees Celsius, Jeffersonville’s IT equipment is cooked — literally.
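Automated monitoring of this kind typically polls a room temperature sensor and raises an alert once a threshold is crossed, leaving whatever reaction time the cooling margin allows. The following is a minimal, hypothetical sketch — the sensor-reading function, alert addresses and warning threshold are assumptions for illustration, not Jeffersonville’s actual system — of what such a check might look like:

```python
import time
import smtplib
from email.message import EmailMessage

# Hypothetical thresholds, loosely based on the figures in the article:
# warn well before the 33 C "death point" so there is time to react.
WARN_THRESHOLD_C = 27.0
CRITICAL_THRESHOLD_C = 33.0
POLL_INTERVAL_SECONDS = 60

def read_room_temperature_c() -> float:
    """Placeholder: read the computer-room temperature from whatever
    sensor or SNMP-enabled device the site actually uses."""
    raise NotImplementedError("wire this to your monitoring hardware")

def send_alert(subject: str, body: str) -> None:
    """Send a plain-text e-mail alert; the SMTP host and addresses are assumptions."""
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "monitor@example.org"
    msg["To"] = "it-director@example.org"
    msg.set_content(body)
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

def main() -> None:
    # Simple polling loop; a real deployment would throttle repeat alerts.
    while True:
        temp = read_room_temperature_c()
        if temp >= CRITICAL_THRESHOLD_C:
            send_alert("CRITICAL: data centre overheating",
                       f"Room temperature is {temp:.1f} C - at or past the shutdown point.")
        elif temp >= WARN_THRESHOLD_C:
            send_alert("WARNING: data centre temperature rising",
                       f"Room temperature is {temp:.1f} C - investigate cooling now.")
        time.sleep(POLL_INTERVAL_SECONDS)

if __name__ == "__main__":
    main()
```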

That’s what happened this month when the city lost US$20,000 worth of equipment after its strained air conditioning system shut down during a spell of warm weather.

Hardy, the sole IT worker for the community of about 29,000 residents, was seeking approval from city officials this week to add more chiller capacity.

That wasn’t an IT problem he expected when he took the job in Jeffersonville last December, since the city built a new data centre just last year.

But Hardy’s predecessor died very early in the project.

And later construction decisions didn’t fully account for future cooling needs, Hardy said; it wasn’t until after he was hired that he discovered the new computer room had only a fraction of the required cooling capacity.

To try to avoid unpleasant surprises like the one that occurred in Jeffersonville, IT managers are increasingly investing in computer-aided studies that map the airflow in data centres — similar to the computational fluid dynamics studies that automotive or aircraft manufacturers use to see how air moves around objects.

Even if an IT facility has ample cooling capacity, it could still have heat problems if equipment isn’t properly arranged.

High-density systems such as blade servers are particularly vulnerable to airflow problems.

But airflow studies can cost as much as US$150,000, said Mark Evanko, president and principal engineer at Edison, N.J.-based Bruns-Pak Corp., which conducts computational fluid dynamics studies as part of its data centre engineering and design services.

The studies can be complicated, Evanko said. Assembling the data for a computerized model can involve going from rack to rack and verifying every aspect of airflow in a data centre, he said. The modeling also has to be able to account for possible changes in a data centre’s configuration.
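The rack-by-rack data gathering Evanko describes amounts to building an inventory of heat loads, temperatures and airflow figures that a modelling tool can consume. Purely as an illustrative sketch — the field names and example values are assumptions, not Bruns-Pak’s actual inputs — the kind of record a surveyor might collect per rack could look like this:

```python
from dataclasses import dataclass

@dataclass
class RackProfile:
    """One rack's contribution to an airflow/heat model (illustrative fields only)."""
    rack_id: str
    row: str                 # position on the raised floor, e.g. which hot/cold aisle
    heat_load_kw: float      # total power drawn by the equipment in the rack
    intake_temp_c: float     # measured air temperature at the front of the rack
    exhaust_temp_c: float    # measured air temperature at the rear of the rack
    airflow_cfm: float       # fan airflow through the rack, cubic feet per minute

# Example of the kind of data recorded while walking the room, rack by rack.
sample = RackProfile(
    rack_id="A-07",
    row="cold-aisle-3",
    heat_load_kw=6.5,
    intake_temp_c=21.0,
    exhaust_temp_c=34.0,
    airflow_cfm=900.0,
)
print(sample)
```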

Question of effectiveness

In addition, there is debate about how effective the computational fluid dynamics studies are within data centres. Studies of airflow “look good,” said John Musilli, a data centre operations manager at Intel Corp.

“But at the end of the day, it only works when you have a pristine design.”

Musilli, who is also a member of the Data Centre Institute think tank within the AFCOM professional association for data centre managers, said that as soon as users begin adding equipment to data centres or moving systems around, they create turbulence that can upset the airflow models.

Intel does use computational modeling in its data centres, but Musilli said that it is just one tool. The models “will tell you if you have a big problem,” he said. But they can show large, ominous-looking red areas over racks of servers that upon closer inspection “may not be significant,” according to Musilli.

Heat-related server problems may be obvious in some cases but less so in others. For instance, sporadic disk-drive failures may be attributable to normal mechanical problems and not necessarily to hot spots in data centres.

But Bob Sullivan, an engineer and consultant at The Uptime Institute Inc. in Santa Fe, N.M., said he believes that excessive heat is at the root of many IT equipment problems. “The problem is larger than people think,” he suggested.

The Uptime Institute looked at 30 computer rooms totaling 300,000 square feet of data centre space and found that, on average, 10 per cent of the server cabinets had hot spots — areas around them where the temperature was 25 degrees Celsius or higher.

Sullivan said IT managers can do a lot to control the hot-spot problem by placing thermometer strips on their IT equipment, checking them regularly and taking action if temperatures are rising.
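In practice, the regular checks Sullivan recommends can be as simple as recording each cabinet’s strip reading and flagging anything that meets the hot-spot definition used in the Institute’s survey. Here is a minimal sketch, assuming readings are logged by hand into a CSV file; the 25 C threshold comes from the survey above, while the file layout and column names are assumptions:

```python
import csv

HOT_SPOT_THRESHOLD_C = 25.0  # hot-spot threshold cited in the Uptime Institute survey

def find_hot_spots(csv_path: str) -> list[tuple[str, float]]:
    """Read cabinet temperature readings (columns: cabinet, temp_c) and
    return the cabinets at or above the hot-spot threshold, hottest first."""
    hot = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            temp = float(row["temp_c"])
            if temp >= HOT_SPOT_THRESHOLD_C:
                hot.append((row["cabinet"], temp))
    return sorted(hot, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    for cabinet, temp in find_hot_spots("cabinet_temps.csv"):
        print(f"{cabinet}: {temp:.1f} C - investigate cooling and airflow")
```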

Mark Levin, an independent consultant at Metrics Based Assessments LLC in Milford, Conn., said that although data on heat-related system failures is lacking, he has seen evidence that IT managers are scrambling to cope with the problem.

The need to do something is often apparent when Levin tours data centres. “When you walk through a data centre and can feel the hot spots, you know there is a problem,” he said.

Hardy hopes to increase the cooling capacity in Jeffersonville’s data centre within the next two months. But until that happens, he said, “it’s going to be a rough road and a few late nights.”
