Uni supercomputer models Antarctica

Simulating the Antarctic environment may involve a wealth of scientific know-how, but for the team involved with the Tasmanian Partnership for Advanced Computing (TPAC), it’s also about having the most advanced and reliable computing power at your fingertips.

TPAC is a centre of expertise that develops numerical models for high-performance computers, focusing on the Southern Ocean, the atmosphere, the Antarctic Ice Sheet and the wider environment. As well as the University of Tasmania, TPAC is supported by the Antarctic Co-operative Research Centre, CSIRO, the Australian Antarctic Division (AAD), the Bureau of Meteorology and various Australian marine laboratories.

To boost the international role its partners play in Antarctic research, TPAC recently announced the opening of a A$3.3 million (US$1.9 million) high-performance computing facility based at the University of Tasmania. The facility consists of a supercomputer connected to a 20TB robotic silo system, plus a 6 per cent share of the Australian Partnership for Advanced Computing (APAC) national facility located at the Australian National University, equating to around one teraflop of computing speed and 300TB of storage.

Complementing the launch of the new facility, the University of Tasmania team replaced its original Cray Inc. supercomputing system with a new 24-processor Silicon Graphics Inc. (SGI) Origin 3000 supercomputer. The Origin 3000 is capable of holding up to 128 processors and up to 256GB of memory per compute rack.

Valued at around A$1 million, the supercomputer was purchased with a $673,000 federal government grant, together with donations from the University of Tasmania’s partners.

According to Associate Professor Nathan Bindoff, a physical oceanographer at the Antarctic CRC and TPAC director, the computing power upgrade gave TPAC partners the means of increasing the amount of research they could conduct on Antarctica by shortening the time it took to generate and interpret information. Studies using the supercomputer span polar research, climate modelling (atmosphere, ocean and ice sheet simulation), mathematical modelling of magnetic resonance imaging, computational chemistry, and engineering fluid dynamics.

For example, Bindoff said implementing the new supercomputer has allowed researchers to halve the time it takes to complete some of their long-term 3D numerical model research projects, which are written in Fortran (the “formula translation” programming language).

“Speed ups were significant, from two years to one year,” he said. “It [the supercomputer] means being able to do state-of-the-art work and do it quickly enough to respond to the questions at hand.”

In all, the supercomputer now powers projects undertaken by more than 40 research scientists from the Antarctic CRC and the AAD, as well as around 26 PhD students at the University of Tasmania.

Initially, the university ran its research on a Cray J916/8 supercomputer, upgrading to an SV model in 1998-99. While still capable of the work, the upgraded Cray system had fallen behind the performance gains made across the industry over the intervening three years, Bindoff said.

Alongside computing power, Bindoff said the team devised three criteria for choosing the most appropriate replacement for the Cray system: interconnect capability between the clustered processors, interoperability with the team’s existing infrastructure and applications, and compatibility with commercial software solutions used by its various scientists and researchers.

Although the new computer’s processors run at only 500MHz each, the hardware offered the fastest exchange of data between CPUs of the systems considered, he said.

“We were looking at how the CPUs talked to each other. Clusters can be PCs simply strung together with Ethernet cable, which is an inadequate solution to do the sorts of problems and research we do,” Bindoff said.

The constant communication between CPUs in the SGI Origin 3000 means that programs can run uninterrupted across all the CPUs – an important point, considering some of the modelling projects run on the supercomputer for years at a time yet exchange information between CPUs every few seconds, Bindoff said.
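The pattern Bindoff describes – a long-running simulation whose pieces must swap boundary values every few seconds – is common to parallel climate and ocean models. The following is a hypothetical illustration only, not TPAC’s actual code: a tiny 1D heat-diffusion grid is split between two “processors”, and each timestep every half needs its neighbour’s edge cell (a so-called halo exchange) before it can advance, which is why a slow interconnect stalls the whole run.

```python
# Hypothetical sketch of per-step halo exchange in a split numerical model.
# Not TPAC's code: a 1D diffusion grid divided between two workers.

def step(cells, left_ghost, right_ghost, alpha=0.1):
    """One explicit diffusion timestep over a subdomain.
    left_ghost/right_ghost are the neighbouring values just outside it."""
    padded = [left_ghost] + cells + [right_ghost]
    return [padded[i] + alpha * (padded[i - 1] - 2 * padded[i] + padded[i + 1])
            for i in range(1, len(padded) - 1)]

def run_split(grid, steps):
    """Run the model split across two 'processors' with a halo exchange
    at every timestep -- the communication a fast interconnect speeds up."""
    mid = len(grid) // 2
    a, b = grid[:mid], grid[mid:]
    for _ in range(steps):
        # Halo exchange: each half passes its edge cell to the other
        # before either can update (values captured from the old state).
        a_edge, b_edge = a[-1], b[0]
        a = step(a, a[0], b_edge)      # outer boundary is insulated
        b = step(b, a_edge, b[-1])
    return a + b

def run_serial(grid, steps):
    """Same model on one 'processor', for comparison."""
    for _ in range(steps):
        grid = step(grid, grid[0], grid[-1])
    return grid
```

Because the explicit scheme updates every cell from the previous step’s values, the split run reproduces the serial run exactly; the only cost of splitting is the exchange itself, repeated every timestep for the life of the simulation.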

TPAC’s robotic silo storage system, which houses 10TB of mirrored modelling data on Antarctica, also had to be integrated with the new supercomputing facility. Of the vendors offering supercomputing solutions, SGI’s proved not only the simplest to link to the silo system, but also provided the best scalability tools for future upgrades, Bindoff said. Products from Sun Microsystems Inc., Cray and Compaq Computer Corp. were also considered before the group settled on SGI’s supercomputing system.

SGI’s supercomputing solution was also more closely aligned with the standard Unix environment the group uses to conduct its modelling research.

Overall, Bindoff said the computer provides a four-fold increase in computing power, as well as an increase in RAM from the 4GB previously available to 24GB. The team’s storage space has also risen from several hundred gigabytes on the Cray system to 4TB on the new SGI hardware.