Five easy ways to save power in your data centre

Sometimes the easiest fix to a problem is the most overlooked, and conserving power in the data centre is no different. Here are five easy and inexpensive ways to improve your data centre’s energy efficiency and reduce power consumption.

“These are easy fixes anyone can do that can save 10 per cent to 15 per cent on energy consumption in the first week,” says Jim Smith, vice president of engineering for Digital Realty Trust, a data centre design firm in Dallas.

1. The old hot-aisle, cold-aisle

Everyone knows that a hot-aisle, cold-aisle layout of computer equipment is the best method for ensuring that a data centre is properly cooled. In this layout, cool air flows through the cold aisles, where it is drawn in by the servers and other equipment, and the heated air is exhausted out of the back of the systems into the hot aisles and on to the air conditioner return ducts.

Although hot-aisle, cold-aisle design delivers more benefit than any other data centre design decision, IT needs to watch for gear that exhausts air somewhere other than out the back. You need a way to handle the hot exhaust from side- or top-ventilated devices. Cisco gear, for example, uses side ventilation; Verari servers and blades and some HP gear exhaust hot air through the top of the equipment. To preserve the hot-aisle, cold-aisle configuration, these devices should either be placed in their own part of the data centre, where all exhaust air can be directed to a hot aisle, or be intermixed with rear-exhausting equipment in a rack, such as one from Chatsworth, that redirects side-ventilated exhaust to the back of the cabinet.

Racks should also be positioned perpendicular to the air conditioning units to allow the unobstructed return of heated air.
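To make that kind of placement audit concrete, here is a minimal inventory-check sketch in Python. The device list and names are hypothetical; only the exhaust directions for the Cisco and Verari gear come from the article.

    # Flag gear whose exhaust direction breaks a standard hot-aisle,
    # cold-aisle row. The inventory below is hypothetical.

    from dataclasses import dataclass

    @dataclass
    class Device:
        name: str
        exhaust: str  # "rear", "side", or "top"

    inventory = [
        Device("generic 1U server", "rear"),
        Device("Cisco switch", "side"),         # side-ventilated
        Device("Verari blade chassis", "top"),  # top-exhausting
    ]

    # Anything that doesn't exhaust to the rear needs its own zone or a
    # cabinet (e.g. Chatsworth-style) that redirects exhaust to the back.
    for d in inventory:
        if d.exhaust != "rear":
            print(f"Review placement of {d.name}: {d.exhaust}-exhausting")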

2. Ready, set, go airflow

Establish the right set points when configuring data centre temperature and humidity. “Most people have the perception that the data centre should be cold,” says Smith.

“In reality, the data centre should be slightly warm and slightly humid. If you look at the American Society of Heating, Refrigerating and Air-Conditioning Engineers [ASHRAE] TC 9.9, the mid-point of their range is 74 degrees and 50 per cent humidity,” says Smith.

That’s the ideal environment for data centre operation, says Smith.

“Many times you walk into data centres and it’s really cold. Getting from the ASHRAE-recommended set point down to 67 or 68 degrees takes extra energy; it’s harder to get things colder,” says Smith. “People also think a data centre should be cool everywhere. It should only be cool on the intakes of the equipment. We constantly see people trying to cool air where heat is supposed to be.”

Placing temperature sensors on equipment and running computational fluid dynamics studies can help IT determine the correct set points for temperature and humidity. Most equipment is warranted to operate at temperatures as high as 95 degrees Fahrenheit. Temperature sensors are available from APC’s NetBotz division.

Sun estimates that IT can reduce its energy costs by at least 4 per cent for every degree of upward change in the set point.
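Taken at face value, that rule of thumb compounds quickly. Here is a minimal back-of-the-envelope sketch, assuming the 4 per cent saving applies per degree Fahrenheit and compounds with each degree raised; the 68-degree starting point and baseline are illustrative only.

    # Cooling-cost estimate based on Sun's figure of at least 4 per cent
    # savings per degree of upward set point change.
    # Assumption: the saving compounds per degree Fahrenheit raised.

    SAVINGS_PER_DEGREE = 0.04

    def remaining_cost_factor(current_f: float, target_f: float) -> float:
        """Fraction of today's cooling cost left after raising the set
        point from current_f to target_f."""
        degrees_raised = max(0.0, target_f - current_f)
        return (1.0 - SAVINGS_PER_DEGREE) ** degrees_raised

    # Raising an over-chilled room from 68F to the ASHRAE mid-point of 74F:
    factor = remaining_cost_factor(68, 74)
    print(f"Remaining cooling cost: {factor:.1%}")      # about 78.3%
    print(f"Estimated savings:      {1 - factor:.1%}")  # about 21.7%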

3. Get your bypass air right

In raised-floor data centres, the job of the floor is to deliver air right where you need it. Bypass air is conditioned air that doesn’t reach the intake of the systems that need it. Anytime air is coming out of the floor where it shouldn’t, you are just expending energy to move that air. Look for holes around conduits, tile cuts, or places where equipment used to be, and patch them.
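One way to put a number on the problem is to compare the total conditioned air supplied under the floor with what the equipment actually draws in. A minimal sketch, with hypothetical CFM figures standing in for measured values:

    # Rough bypass-air estimate: supply air that never reaches an equipment
    # intake is wasted fan and cooling energy. All figures are hypothetical.

    supply_cfm = 40_000  # total conditioned air delivered under the floor
    intake_cfm = 31_000  # air actually drawn in by the equipment

    bypass_cfm = supply_cfm - intake_cfm
    bypass_fraction = bypass_cfm / supply_cfm
    print(f"Bypass air: {bypass_cfm} CFM ({bypass_fraction:.0%} of supply)")
    # A high fraction points to unsealed tile cuts, cable openings, or
    # abandoned equipment cut-outs that should be patched.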

4. Getting your tiles right

In a cold aisle, perforated tiles or grates let cold air rise and direct it to the intakes of the system gear. Hot aisles shouldn’t have perforated tiles. When you make configuration changes, be sure to also reconfigure your perforated tiles or grates.
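If you track tile positions at all, even a trivial script can catch misplaced perforated tiles after a reconfiguration. A sketch, with a hypothetical aisle map:

    # Perforated tiles belong in cold aisles only; flag any that have
    # ended up in a hot aisle. The aisle map below is hypothetical.

    aisle_tiles = {
        "cold-1": ["perforated", "solid", "perforated"],
        "hot-1":  ["solid", "perforated", "solid"],  # misplaced tile
    }

    for aisle, tiles in aisle_tiles.items():
        if aisle.startswith("hot") and "perforated" in tiles:
            print(f"{aisle}: swap perforated tiles for solid tiles")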

5. Blankety-blank it up

Another easy fix is the use of blanking plates. If you have a hot-aisle, cold-aisle environment and equipment changes occur, you may leave holes in racks that let hot air from the hot aisle enter the cold aisle.

“We have searched and searched for blanking plates, and in our own data centers we just put up a piece of Plexiglas and velcro for blanking plate openings,” says Smith.
