Blue Gene to bridge petaflop performance

A faster supercomputer – 1,000 times more powerful than Deep Blue and about 2 million times faster than today’s desktop PCs – is the lofty goal of a new research project at IBM Corp.

Nicknamed “Blue Gene” because it will be used primarily to study complex human proteins, the computer will be capable of more than one quadrillion operations per second (one petaflop), according to IBM.

Following Moore’s Law, a petaflop-scale computer is still 15 years in the future, but IBM plans to do it in about five with Blue Gene. This will be possible thanks to a new approach to computer design and a radically simplified architecture, said Dr. Dennis Newns, a researcher specializing in molecular dynamics at IBM’s T.J. Watson Research Center in Yorktown Heights, N.Y.
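The five-versus-fifteen-year gap can be checked with a back-of-envelope calculation. Assuming performance doubles roughly every 18 months and that a leading supercomputer of the day delivers about a teraflop (both assumptions, not figures from the article), the Moore's Law timeline falls out directly:

```python
import math

# Assumed figures (not from the article): performance doubles every
# ~18 months, and a late-1990s leading supercomputer runs at ~1 teraflop.
current_flops = 1e12          # ~1 teraflop
target_flops = 1e15           # one petaflop
doubling_period_years = 1.5   # Moore's Law doubling time

doublings = math.log2(target_flops / current_flops)
years = doublings * doubling_period_years
print(f"{doublings:.0f} doublings, about {years:.0f} years")
# prints "10 doublings, about 15 years"
```

A thousandfold speedup needs about ten doublings, which at the assumed rate is roughly fifteen years, matching the article's estimate of what Blue Gene would otherwise have to wait for.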

“There are really two ways in which this architecture is spectacularly simple. One is that the instruction set used in the [processors] is a very short one – only 15 instructions,” he said.

“The second way is that there is only one type of chip in the computer, apart from power supplies. So it’s just a large array of one single kind of chip, and these chips are connected by high-bandwidth links.”

According to Newns, Blue Gene uses a Processor-in-Memory (PIM) design, in which the same chip houses both the processors and the memory. There will be about 32,000 chips used in the computer and “since there are 30 processors per chip – that’s a million processors all together,” he said.
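The quoted figures are easy to verify, and dividing the petaflop target across them (an inference, not a figure from the article) shows the design philosophy: modest per-processor speed, enormous parallelism.

```python
# Processor count as quoted in the article.
chips = 32_000
processors_per_chip = 30
total_processors = chips * processors_per_chip   # 960,000 — "about a million"

# Splitting the one-petaflop target evenly (an inference, not an article
# figure) gives each processor roughly a gigaflop.
flops_per_processor = 1e15 / total_processors
print(total_processors, f"{flops_per_processor:.2e}")
# prints "960000 1.04e+09"
```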

“So this type of design, with the memory on the chip, eliminates a lot of the complication that you have in a regular computer architecture. It is possible to carry out all the functions that a conventional computer can carry out, just with one type of chip.”

The computer’s architecture has been dubbed SMASH, which stands for Simple, Many And Self-Healing, he said.

“We plan that the program will identify [defective chips] and simply route them out of the system during the course of the run.”

IBM’s US$100 million initiative comprises three main groups of researchers: programmers; molecular dynamics specialists, who study the scientific modelling and coding required to describe the movement of atoms; and geneticists, Newns said.

Blue Gene will initially be used to model the folding of human proteins, which is expected to ultimately give medical researchers a better understanding of gene mutations and diseases.

“Up to now, the mechanics of protein folding has been very difficult to research because there was no computer anywhere near powerful enough to follow the process,” Newns said.

Protein folding occurs on an atomic level and only takes about a millisecond, he said. “So only by modelling it on a computer can you understand the dynamics of these kinds of processes.”
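The millisecond figure is what makes the problem so demanding. Molecular dynamics advances in tiny timesteps, typically around a femtosecond (an assumed standard value, not stated in the article), so following a single folding event end to end means an enormous number of steps:

```python
folding_time_s = 1e-3   # ~1 millisecond of folding, per the article
timestep_s = 1e-15      # ~1 femtosecond per MD step (typical; assumed here)
steps = folding_time_s / timestep_s
print(f"{steps:.0e} timesteps")   # prints "1e+12 timesteps"
```

A trillion sequential timesteps, each requiring a full force calculation over every atom, is what no machine of the era could get through in reasonable time.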

Computer simulation of protein folding is being done today, but a typical protein takes a year or more to simulate. At the current rate, understanding the full range of human proteins and their functions would take more than a century, Newns said.

Dr. Leo Spyracopoulos, assistant professor of biology at the University of Alberta in Edmonton, explained why genetic research is so compute-intensive, from both a storage and a speed perspective.

“The typical protein might have thousands of atoms. So the computer has to keep track of all these coordinates. They all have velocities, they all have masses, they are all surrounded by other charges. So typically, it’s a really difficult problem to try to tackle.”
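Spyracopoulos's description (coordinates, velocities, masses, surrounding charges) maps directly onto the core loop of a molecular dynamics code. Below is a minimal one-dimensional sketch of a velocity-Verlet integration step, an illustrative toy rather than anything from IBM's project; real simulations track three-dimensional coordinates for thousands of atoms under far more elaborate force fields.

```python
def velocity_verlet_step(pos, vel, masses, force_fn, dt):
    """Advance every atom by one timestep dt (1-D toy version).

    pos, vel, masses: per-atom lists; force_fn(pos) returns per-atom forces.
    """
    acc = [f / m for f, m in zip(force_fn(pos), masses)]
    # New coordinates from current velocities and accelerations.
    pos = [x + v * dt + 0.5 * a * dt * dt for x, v, a in zip(pos, vel, acc)]
    # New velocities from the average of old and new accelerations.
    new_acc = [f / m for f, m in zip(force_fn(pos), masses)]
    vel = [v + 0.5 * (a0 + a1) * dt for v, a0, a1 in zip(vel, acc, new_acc)]
    return pos, vel

# Toy run: a single atom on a harmonic spring (force = -x).
def spring(ps):
    return [-x for x in ps]

pos, vel, masses = [1.0], [0.0], [1.0]
for _ in range(1000):
    pos, vel = velocity_verlet_step(pos, vel, masses, spring, dt=0.01)
```

A protein run repeats a step like this trillions of times over thousands of interacting atoms, which is why the workload dwarfs conventional machines and suits Blue Gene's massive parallelism.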

He said a computer powerful enough to map the primary sequence of a protein onto the structure it would eventually take “would be an extremely powerful tool.”

IBM is hoping that concepts learned with the research project will also improve overall future computer performance and architectures.

Rich Partridge, vice-president of parallel open systems hardware with Port Chester, N.Y.-based D.H. Brown Associates Inc., said one reason Blue Gene will be “screamingly fast” compared to traditional computers is that it doesn’t make any operational trade-offs for speed.

“The fact that [IBM] offloads certain general purpose functions to the front end is not strange and unnatural. In fact, it’s likely to be the way future platforms [may be built],” he said.

IBM’s Newns agrees. “People have essentially operated with [computer] architectures roughly constant, and simply have gained a free ride on the improvement of every generation of chips. Whereas now we’re foreseeing, somewhere in the 10-year timescale perhaps, a future in which this may no longer be the case.”
