Bull has launched the bullx, a supercomputer that uses blade servers and water cooling to be green and fast, the company said on Tuesday.
Everything in the system has been chosen to suit the demands of high-performance computing customers, according to Fabio Gallo, vice president and general manager of Bull Extreme Computing Solutions.
The number-one criterion for these customers is application performance, but having an energy-efficient system is also becoming increasingly important, Gallo said. Bull is seeking to emerge as the leader in the European HPC space.
The bullx system is based on blade servers, which Gallo called the most efficient way of building large-scale parallel systems: blades are easier to install and maintain than standalone servers. Recently, Cisco Systems Inc. announced it was entering the blade server market with its Unified Computing System B-Series.
Each computing blade in the bullx system is equipped with two quad-core Intel Xeon 5500 processors, and up to 18 blades can be installed in one chassis. Blades talk to each other through an integrated InfiniBand QDR (Quad Data Rate) switch, which supports up to 40Gbps. A rack can then be fitted with six chassis, and the number of racks in a system is unlimited. To get performance of 1 petaflop — which right now is the performance of the largest systems in the world — out of the bullx, a customer needs 100 racks, Gallo said.
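Those figures pin down the system's scale, and the arithmetic is worth spelling out. A quick sketch, using only the numbers in the article (the function and variable names are mine, not Bull's):

```python
# Back-of-the-envelope sizing for a bullx configuration, based on the
# figures in the article: 18 blades per chassis, 6 chassis per rack,
# and 2 quad-core Xeon 5500 processors per blade.

BLADES_PER_CHASSIS = 18
CHASSIS_PER_RACK = 6
CPUS_PER_BLADE = 2
CORES_PER_CPU = 4

def cores_in_racks(racks: int) -> int:
    """Total CPU cores in a configuration with the given number of racks."""
    blades = racks * CHASSIS_PER_RACK * BLADES_PER_CHASSIS
    return blades * CPUS_PER_BLADE * CORES_PER_CPU

# The 100-rack system Gallo cites for 1 petaflop:
print(cores_in_racks(100))  # 86400 cores
```

At 100 racks that works out to 10,800 blades and 86,400 cores, so the cited 1-petaflop figure implies a peak of roughly 11 to 12 gigaflops per core, which is in line with quad-core Xeon 5500 chips of that era.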
Besides the computing blades, there are also so-called accelerator blades, which use graphics processors to speed up floating-point calculations by offloading them from the computing blades.
The accelerator blades are not for everyone, because applications have to be ported and tuned to take advantage of the improvements the cards offer, according to Gallo. But many organizations that develop their own code are eager to do that porting and tuning, and Bull can also help them with it, Gallo said.
Another important feature is that the racks use water-cooled doors in all configurations of the system. Water cooling is key when building energy-efficient data centers, because it is up to 75 percent more efficient than air cooling, according to Gallo.
With cooling systems eating up as much as half of a data center's operating costs, it makes sense to cut cooling bills any way you can. Some organizations are even investigating the possibility of using cold climates to cool data centers.
The bullx also comes with integrated protection against short power interruptions. That means customers won't have to use uninterruptible power supplies, which make the data center less energy efficient, Gallo said.
The system runs either Red Hat Enterprise Linux plus Bull's own cluster suite or Windows HPC Server 2008. Bull started pushing into the high-performance computing space in 2005, and last year it acquired the German company science + computing. Currently, Bull has approximately 100 customers.
With the bullx, Bull targets production high-performance computing in the government sector, at automotive and aerospace companies, at oil and gas companies, and in the financial services area, according to Gallo.
Applications include seismic processing, weather forecasting and crash analysis, he said.
The cost of the bullx will vary from tens of thousands of euros to tens of millions of euros depending on the size of the configuration.