IBM pitches a supercomputing Fastball

IBM Corp. Thursday said it had developed technology to speed up the way large computer networks access and share information.

Under a project code-named “Fastball,” IBM’s ASC Purple supercomputer has been able to achieve 102GB per second of sustained read-and-write performance to a single file — the equivalent of downloading 25,000 songs in a second over the Internet, according to IBM.
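IBM’s songs-per-second comparison follows from simple arithmetic. As a rough check — the ~4MB average song size is an assumption here, not a figure IBM supplied — the numbers line up:

```python
# Back-of-the-envelope check of IBM's "25,000 songs per second" claim.
# Assumption: an average compressed song is roughly 4 MB (4e6 bytes).
throughput_bytes_per_sec = 102e9   # 102 GB/s sustained read-and-write
song_size_bytes = 4e6              # hypothetical average song size

songs_per_second = throughput_bytes_per_sec / song_size_bytes
print(int(songs_per_second))  # 25500, close to IBM's 25,000 figure
```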

IBM’s General Parallel File System (GPFS) software was used to manage the transfer of data between thousands of processors and disk storage devices. IBM said it had to enhance the software in several areas to handle such fast data rates.

For example, it employed new fencing techniques to prevent individual hardware failures from causing the overall system to fail and added new capabilities to orchestrate flow control between all of the different hardware components in the system.
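Fencing, in this context, means isolating a failed component so the rest of the cluster keeps serving I/O. As a minimal sketch — the class and node names are illustrative, not IBM’s actual implementation — fencing can be modeled as excluding failed nodes from the pool eligible to touch shared storage:

```python
# Minimal sketch of node fencing: a failed node is excluded from the
# set of nodes eligible to serve I/O, so one hardware failure does not
# take down the whole system. All names here are illustrative.
class Cluster:
    def __init__(self, nodes):
        self.nodes = set(nodes)
        self.fenced = set()

    def fence(self, node):
        """Isolate a failed node from shared storage access."""
        self.fenced.add(node)

    def active_nodes(self):
        """Nodes still allowed to participate in I/O."""
        return self.nodes - self.fenced

cluster = Cluster(["node01", "node02", "node03", "node04"])
cluster.fence("node03")  # node03's hardware has failed
print(sorted(cluster.active_nodes()))  # ['node01', 'node02', 'node04']
```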

“If they all go real fast at the same time you get a traffic jam and performance goes down,” said Chris Maher, director of high-performance computing development for IBM’s Systems and Technology Group.

ASC Purple, the world’s third most powerful supercomputer according to the Top500 list, is housed at Lawrence Livermore National Laboratory (LLNL), where the Fastball capabilities were demonstrated. IBM supplied the computer to the U.S. Department of Energy and LLNL for use in nuclear weapons research.

The Fastball project combined IBM servers, a high-performance computing switch network and storage subsystems tied together through the enhanced version of the GPFS software. IBM used 416 individual storage controllers combined with 104 Power-based eServer p575 nodes.

In the Fastball demonstration, 1,000 clients requested a single file at the same time. Through virtualization techniques, the software then spread that file across hundreds of disk drives. The resulting file system was 1.6 petabytes in size.
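A parallel file system such as GPFS gets its throughput by striping a file’s blocks across many disks, so that many clients can read different blocks at the same time. A minimal round-robin striping sketch — the block size and disk count here are arbitrary, not the demonstration’s actual parameters:

```python
# Illustrative round-robin striping: split a file into fixed-size
# blocks and distribute them across N disks, as a parallel file
# system like GPFS does at vastly larger scale.
def stripe(data: bytes, block_size: int, num_disks: int):
    disks = [[] for _ in range(num_disks)]
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        disks[(i // block_size) % num_disks].append(block)
    return disks

disks = stripe(b"ABCDEFGHIJKL", block_size=2, num_disks=3)
# disk 0 holds [b'AB', b'GH'], disk 1 [b'CD', b'IJ'], disk 2 [b'EF', b'KL'],
# so three clients could each read a different disk in parallel.
print(disks)
```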

Researchers on the project say this kind of computing could be applied to a range of data-intensive applications.

“You can imagine the kind of problems you can solve with this, like a tsunami warning device that would scrutinize huge amounts of information from the ocean and then analyze that quite quickly, or for homeland security applications, where you need to scan images of people and match those images against large databases,” Maher said.

Other applications could include medical research and online gaming, Maher said.

A future area of focus for the researchers is automatically matching data to appropriate storage resources as it is generated, based on predefined policies, Maher said.
