TORONTO—Exascale computing, or one million trillion calculations per second, is the next barrier to beat in high-performance computing (HPC). But one expert said the milestone will do little good if the enormous amount of data generated is not put to full use.
Eng Lim Goh, chief technology officer with Fremont, Calif.-based HPC vendor Silicon Graphics Inc., said that even at the current petascale level, or one thousand trillion calculations per second, much of the data produced goes unused. One organization he knows of leaves 95 per cent of its generated data “on the floor, untouched because they can’t deal with the data produced,” said Goh.
“We’re still not dealing with it today, even with petascale,” Goh told the audience at the five-day HPCS 2010 conference.
HPC is used to run massive and intensive computations for tasks such as simulating the path of a tornado and where it will touch down, in an effort to save lives.
But complicating the matter is that data is generated not just by HPC systems but also by myriad sensors, such as satellites and cameras. Goh noted that the NASA Solar Dynamics Observatory alone generates 1.4 terabytes of data daily.
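To put that figure in perspective, a quick back-of-the-envelope calculation, using the numbers cited here and borrowing, purely for illustration, the five-per-cent retention rate from the organization Goh described, shows how fast such a stream outruns analysis:

    # Rough arithmetic from the figures in this article. The 5% retention
    # rate is an illustrative assumption taken from the organization Goh
    # described, not an SDO statistic.
    DAILY_TB = 1.4            # SDO output, terabytes per day
    RETAINED_FRACTION = 0.05  # assume 95 per cent is left "on the floor"

    yearly_tb = DAILY_TB * 365
    analyzed_tb = yearly_tb * RETAINED_FRACTION

    print(f"Generated per year: {yearly_tb:.0f} TB "
          f"(about {yearly_tb / 1000:.1f} petabytes)")
    print(f"Analyzed at 5 per cent retention: {analyzed_tb:.0f} TB")

At that rate, a single instrument produces roughly half a petabyte a year, of which only about 26 terabytes would ever be examined.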
With so much data being generated, Goh said, storage systems quickly fill up and data analysts often give the data only a cursory glance before throwing it out to avoid the cost of storing it.
Goh said the data analysts themselves often work with only an abstract goal and aren’t even sure what they are looking to discover.