Sun estimates the Broomfield facility will not only save $1 million in electricity costs and 11,000 metric tons of CO2 per year, but also reduce the company’s overall carbon footprint in the U.S. by six per cent.
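Those two figures imply a rough scale for Sun’s U.S. emissions. A back-of-envelope check, assuming the six per cent refers to the same annual CO2 measure as the 11,000 tonnes:

```python
# Figures quoted in the article (assumption: the 6 per cent reduction is
# measured against the same annual CO2 baseline as the 11,000-tonne saving).
annual_co2_saved_t = 11_000      # metric tons of CO2 per year
share_of_us_footprint = 0.06     # six per cent of Sun's U.S. carbon footprint

# Implied total annual U.S. footprint before the saving.
implied_us_footprint_t = annual_co2_saved_t / share_of_us_footprint
print(round(implied_us_footprint_t))  # 183333
```

That puts Sun’s U.S. footprint in the neighbourhood of 180,000 tonnes of CO2 a year, which is the baseline against which the Broomfield saving is being claimed.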
Based on Sun’s Pod Architecture, the building’s scalable, modular design achieves 66 per cent footprint compression by reducing the former 496,000 square foot StorageTek campus to 126,000 square feet.
“We’ve shown that you can build an efficient data centre in an office-type building,” said Mark Monroe, director of sustainable computing at Sun. This sets an example for enterprises, especially those based in major cities, which often need to make use of whatever space is available in their existing buildings, he explained.
Raised floor space was kept to a minimum, reduced from 165,000 square feet to less than 700 square feet. “We have one little set of systems and a couple of mainframes that we use for testing of StorageTek equipment that were required to be on raised floor…everything else is on slab,” said Monroe.
“Raised floor has to be reinforced to be able to support these high density racks, blade servers and high density storage,” Monroe explained. “Without that raised floor, we’re able to pack things into this building a lot more.”
A major hardware replacement program, which consolidated 63 servers and 30 direct-attached storage devices onto two Sun servers, also contributed to the space compression.
Broomfield’s non-chemical water treatment system, expected to save approximately 675,000 gallons of water per year, is new to Sun’s data centre designs. “We added it to this data center in particular because the Colorado area is a high plains environment. There’s not a lot of water here,” said Monroe.
The system not only saves water, but also eliminates chemical contamination of the water it uses. “When a cooling system uses water, we have to blow down the system, which means take water out of the system and replenish it with new water. We have to do that less often with the non-chemical water treatment system,” Monroe explained.
Future plans include grey water diversion. “The water is eventually blown out of the system and doesn’t have the chemicals in it, so we can reuse it for applications like irrigation of the facility,” he said.
Broomfield is also the first Sun site to implement Liebert’s new advanced XD cooling system, which features dynamic cooling controls. The system supports rack loads of up to 30kW and includes a chiller system that is 24 per cent more efficient than American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) standards.
The data centre’s green architecture is further supported by a flywheel uninterruptible power supply (UPS), which eliminates lead and chemical waste by removing the need for batteries, explains Sun.
The biggest lesson IT practitioners can learn from Broomfield is the savings in energy costs, said Aaron Hay, research analyst at Info-Tech Research Group. “In terms of energy efficiency, it goes to show you the amount of savings that are, in fact, available depending on the kinds of technologies that you choose to go with,” he said.
“People don’t have opportunities to engage in a lot of capital spending right now,” said Hay. “To do a data centre retrofit, you’re looking at a lot of upfront costs and the longer term cost saving to pay off those costs, but with consolidation – improving the efficiency of your existing data centre without having to do a lot of retrofit – you’re probably going to be looking at a series of smaller upfront costs that may not offer you the same jump in efficiency that a big project like this would offer, but will still offer you significant energy savings.”
A new Info-Tech survey asked 1,000 IT professionals worldwide where their organization was with respect to implementing an existing server room/data centre upgrade.
“Twenty two per cent are piloting or implementing this project now. There are another 29 per cent who are planning to do so in the next 12 months. So you’re looking at a window of up to 50 per cent of practitioners who are either undertaking this as we speak or are going to get started very quickly,” said Hay.
Facilities, IT and business really work on three different time scales, and it’s important to get those time scales to line up, advised Monroe. Facilities think in terms of decades (i.e., how long the building will last), IT system lifecycles run in single-digit years (three to six), and business needs can change overnight or week by week, he explained.
“You need a building that will last for 10 or 15 years,” said Monroe. “The scalability piece of that is very important because we know that the next generation of computers will be twice as fast and require twice as much power and be twice as dense in terms of the cooling footprint.”
Sun will be the first systems vendor to endorse the best practices and efficient data centre designs discussed in the European Union’s Code of Conduct for data centres, released last November, said Monroe. While voluntary, code-of-conduct documents are often used as the basis for legislation or regulation in EU member states, he pointed out.