HAL

It is a wonder Garry Kasparov ever stood a chance. History’s highest-ranked chess player can compute about three moves per second. Deep Blue, the IBM supercomputer that finally beat him in the spring of 1997, was capable of computing 200 million moves per second. For all of you out there who regularly get thrashed by a $19.99 computer chess game, take heart. It took millions of dollars, a team of dozens and years of development to create a machine capable of beating the best of us.

Today’s supercomputers are helping scientists tackle some of life’s greatest mysteries and seemingly impossible tasks, from probing the origins of the universe and mapping the human genome to accurate weather prediction and sophisticated three-dimensional nuclear tests.

As little as a decade ago, the idea that scientists would be able to create an interactive 3-D model of a human cell was commendable, yet far-fetched. Creating a drug with the help of computers, and knowing exactly how it would interact with a myriad of proteins long before it was ever manufactured and tested, was a dream. It is still years away, but it is now within reach.

“Five years ago this would have been ridiculous for me to even suggest,” said Addison Snell, product manager at SGI in Mountain View, Calif. “Today we are not quite doing it yet but it is not ridiculous to talk about.”

There are still many Holy Grails to find, in fields as diverse as astronomy and biology, but given Moore’s Law, some ingenuity and a touch of luck, many will be discovered in the near future.

To put all of today’s computational power in perspective, a decent Pentium III desktop has more power than all of the computers that existed in 1950, while today’s true supercomputers are thousands of times faster than the fastest desktops on the market.

The granddaddy of them all is ASCI White, the IBM leviathan at the Lawrence Livermore National Laboratory in Livermore, Calif. It is capable of computing more than 12 trillion floating-point operations per second (12 teraflops), making it about three times faster than its nearest competitor. But this extreme power and speed comes at a cost. The US$100 million price tag is enough to send any corporate CFO into nightmares of financial ruin. Consequently, the vast majority of the world’s supercomputers are government owned and run. In fact, of the top 50, only three are in the hands of industry.

is that computer super?

Defining precisely what is and what isn’t a supercomputer is more an issue of semantics than of passing some finely tuned test.

Many say that for a machine to be called a supercomputer, it has to appear on the list at top500.org (www.top500.org/list/2001/06/).

“I say if you make that list, I guess you are a supercomputer and if you don’t, you are not,” said Peter Ungaro, vice-president of high performance computing with IBM Corp. in Washington, D.C.

The list is maintained by the University of Tennessee in Knoxville and the University of Mannheim in Germany, and is about as impartial a ranking as you will find in the tightly contested world of supercomputing.

But even this might open up a can of worms, as computer clusters (also called grids or farms) and distributed computing can potentially enter the fray. About 20 per cent of the top 500 are clusters, with the highest-ranked among them coming in at number 30.

Many individuals refer to SETI@Home, the distributed computing sensation searching for extraterrestrial life, as the world’s most powerful supercomputer, and in some respects it is. With several million users allowing their idle CPU cycles to process radio-telescope data, the combined computational power of all the SETI computers is indeed greater than ASCI White’s.

Yet the SETI model cannot be used for many computational needs. Some tasks, such as weather forecasting, need data processed very quickly, while others, such as nuclear weapons testing, are too entrenched in national security to ever venture into the public domain.

faster, faster, faster

The reality is supercomputers themselves don’t do much more than they did a decade ago. But they do it faster – a whole hell of a lot faster.

David Schwoegler, spokesperson at the Livermore Lab, said they once did a speculative calculation back in 1994 with the fastest machine they had at the time.

“A weapon goes from button to bang in about nine milliseconds…that simulation in two dimensions using the Cray…would have run 6,000 years; in three dimensions, as we are supposed to do, it would have run 60,000 years,” he explained.

When something is going to take that long, you don’t even bother thinking about designing the simulation.

Today the lab’s maximum run time is about four to eight weeks, though researchers still have to make some approximations: Livermore needs about 100 teraflops to do the job properly, and it currently has 12.3.
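To get a rough sense of that gap, here is a back-of-envelope sketch. The 100-teraflop requirement and the 12.3 teraflops available are the figures above; the assumption that run time scales inversely with sustained speed is an illustration, not the lab’s own math.

    # Back-of-envelope sketch of the Livermore shortfall. The 100-teraflop
    # requirement and the 12.3 teraflops available come from the article;
    # assuming run time scales inversely with sustained speed is illustrative.
    NEEDED_TFLOPS = 100.0
    AVAILABLE_TFLOPS = 12.3
    shortfall = NEEDED_TFLOPS / AVAILABLE_TFLOPS  # roughly 8x short

    for weeks in (4, 8):
        print(f"A {weeks}-week full-fidelity run would stretch to about "
              f"{weeks * shortfall:.0f} weeks at 12.3 teraflops")

Under that crude assumption, a single full-fidelity run would tie up the machine for the better part of a year or more, which is why the approximations are unavoidable.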

ASCI White has a size reminiscent of days of yore. The 8,192-processor behemoth weighs in at 106 tonnes. Its 512 shared-memory nodes have an amazing 6.2TB of memory and 160TB of storage on 7,000 disk drives. It covers an area larger than two basketball courts and, needless to say, has a tad of security around it.

The last nuclear tests performed by the U.S. were in 1992. As much as many of us would love to live in a world free of nuclear weapons, the likelihood of that happening anytime soon is almost nil. Since weapon parts have finite lifetimes, there is a need to check the viability of replacement parts. Years ago, you built a bomb, stuck some sensors around it, dropped it in the ground and monitored the explosion. Today everything must be done with computer simulations and, given the alternative, it’s not a bad solution.

come rain or shine

Canadians seem to have an innate need to complain about the weather, and terrible forecasting tops the list.

Well, take note: weather forecasting is far better than it was even a decade ago (no, this is not a lie), and much of the increased accuracy is due to the use of supercomputers. The biggest supercomputer in Canada is run by the Meteorological Service of Canada (MSC) in Dorval, Que. Though nowhere near as powerful as ASCI White, the computer does make the global top 100.

“The reason for using really powerful computers in weather [forecasting] is that if you want to make forecasts for the public across Canada, you have to generate the forecasts within a reasonable amount of time,” said Jean-Guy Desmarais, director of the development branch with MSC.

“Producing a 24-hour forecast is totally useless if it takes you 24 hours to generate the forecast.”

MSC uses its supercomputer to run the extremely complex numerical models used for weather prediction. The atmosphere over Canada is sliced into small boxes – 1,200 by 600 points in the horizontal and 45 points in the vertical. In each box, six atmospheric variables are calculated.

“The number of calculations required to produce a 48-hour forecast is many, many billions of floating-point operations,” he explained.
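A quick back-of-envelope count, sketched below, shows why. The grid dimensions and the six variables come from the description above; the cost per value and the time step are illustrative assumptions, not MSC’s actual model parameters.

    # Rough scale of the forecast grid described above. Grid size and variable
    # count come from the article; flops-per-value and the time step are
    # illustrative assumptions, not the real model's parameters.
    POINTS_X, POINTS_Y, LEVELS = 1200, 600, 45
    VARIABLES = 6

    values_per_step = POINTS_X * POINTS_Y * LEVELS * VARIABLES
    print(f"Values tracked per time step: {values_per_step:,}")  # ~194 million

    FLOPS_PER_VALUE = 100   # assumed cost of updating one value per step
    STEPS = 48 * 60 // 5    # assumed 5-minute time steps over 48 hours
    total_flops = values_per_step * FLOPS_PER_VALUE * STEPS
    print(f"Rough total for a 48-hour forecast: {total_flops:.1e} operations")

Even with these deliberately modest assumptions, the tally lands in the trillions of operations, and the whole job has to finish in far less time than the forecast period it covers.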

All of this increased power, coupled with better numerical models, has led to more accurate and longer-term forecasts for Canadians.

“Every 10 years we improve the predictability of the atmosphere by about 24 hours,” he said.

“The forecast for the third day today is as good as the forecast for the second day was ten years ago. This gives you a (good) measure of the impact of computers,” Desmarais added.

bigger hurdles yet

While long-term weather forecasts require more computational power than three-dimensional nuclear tests, their requirements pale in comparison to the needs of bioinformatics.

The best-known task in this arena is the mapping of the human genome – or as Chris Hogue, CIO of MDS Proteomics Inc. in Toronto, likes to call it – the master parts list. Supercomputers are needed to sift through gigantic databases in order to help create the map. Scientists are about 90 to 95 per cent done and should have the complete parts list in a few years. This will essentially act as a starting point for the study of human proteins. These building blocks of life offer a far more complex challenge.

Proteins start as strings of amino acids and quickly fold into three-dimensional shapes; their functions are a result of their forms. They can become enzymes that help digest food or hormones that dictate body functions. Understanding how they fold is important to understanding how defects can occur and eventually lead to diseases such as Alzheimer’s, cystic fibrosis and even cancer.

“There are an awful lot of factors that go into determining how a protein folds; it is not just the sequence of the protein, it is also the physical conditions, the pH and the ion concentration,” said Christopher Porter, director of operations for the genome database at the Ontario Centre for Genomic Computing in Toronto. “It is very complex.”

One protein can fold trillions and trillions of different ways. Trying to figure out how a protein folds without supercomputers would be fruitless.
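A Levinthal-style back-of-envelope calculation, sketched below, gives a feel for the scale. Both numbers are illustrative assumptions for a small protein, not figures from the article.

    # Levinthal-style estimate of a protein's conformational space. Both
    # numbers below are illustrative assumptions, not figures from the article.
    RESIDUES = 100           # amino acids in a modest protein
    STATES_PER_RESIDUE = 3   # coarse backbone choices per residue

    conformations = STATES_PER_RESIDUE ** RESIDUES
    print(f"Possible conformations: about {conformations:.1e}")  # ~5e47

    RATE = 1e12              # hypothetical conformations tested per second
    SECONDS_PER_YEAR = 3.15e7
    years = conformations / RATE / SECONDS_PER_YEAR
    print(f"Years to enumerate them all at that rate: about {years:.1e}")

Even a machine testing a trillion candidate shapes per second could not enumerate them all, which is why folding is modelled and approximated rather than brute-forced.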

“Once you know the three dimensional structure of a protein, you can then use information about that fold to figure out what it does and how you might intercept it or stop it from doing something it is not supposed to be doing,” Hogue said.

Drugs could be created that address specific protein abnormalities. One such drug, Gleevec, was designed as a specific inhibitor of a type of protein involved in leukaemia.

“That drug was precisely designed with the help of computers…to stop that one protein kinase,” Hogue explained.

This is just the beginning.

“The Holy Grail is a comprehensive understanding of all of the proteins active in tissue, where they are located in the cell and their function,” said Paul Kearney, director of bioinformatics at Caprion Pharmaceuticals Inc. in Montreal.

“To have a model of how a cell works and to be able to probe it electronically…would just be incredible, although I think that is still a long way off,” Porter added.

There is a certain undeniable irony and joy in humans creating machines designed to help us overcome some of the inherent frailties of our own physiology.
