Why personal HPC systems haven't caught on

While personal high-performance computing (HPC) systems have grown in profile recently, enterprises still face barriers to adopting these powerful systems, according to industry observers.

Last year, the supercomputing industry made waves with the launch of IBM Corp.’s US$120-million Roadrunner system, a hybrid machine powered by more than 6,000 dual-core AMD Opteron CPUs paired with Cell accelerator chips. Machines this expensive are capable of a million billion calculations per second (a petaflop), but their use is often limited to academic researchers.

But while these headline-grabbing systems remain in the spotlight, the personal desktop supercomputer has quietly gained traction over the last few years. The global workgroup supercomputer market (machines that sell for less than $100,000) was worth $1.96 billion in 2008, according to research firm IDC, and is expected to reach $2.7 billion by 2013 (all figures U.S.).

Steve Conway, a research analyst with IDC’s High Performance Computing group, said these mini-supercomputers now cluster multiple graphics processing units (GPUs) together with CPUs to deliver increased power for engineering firms, pharmaceutical research shops, automakers and a wide variety of product design units.

“One example is a small engineering services firm that makes parts for an automotive company,” said Conway. “They’ve been working with desktops forever, but their problems are too big to fit on a PC now, so suddenly they are moving up to a small cluster of computing power.”
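To make that shift concrete, consider the kind of offload these hybrid GPU-CPU machines are built for. The sketch below is an illustration only, assuming a CUDA-capable GPU and the third-party CuPy library (whose arrays mirror NumPy’s): the same matrix multiply runs once on the host CPU and once on the GPU.

    # A minimal CPU-versus-GPU offload sketch, assuming a CUDA-capable GPU
    # and the CuPy library (a NumPy-compatible GPU array package).
    import numpy as np
    import cupy as cp

    n = 4096
    a_cpu = np.random.rand(n, n).astype(np.float32)
    b_cpu = np.random.rand(n, n).astype(np.float32)

    # CPU path: NumPy runs the matrix multiply on the host cores.
    c_cpu = a_cpu @ b_cpu

    # GPU path: copy the operands into device memory, run the same
    # multiply on the GPU, then copy the result back to the host.
    a_gpu = cp.asarray(a_cpu)
    b_gpu = cp.asarray(b_cpu)
    c_gpu = cp.asnumpy(a_gpu @ b_gpu)

    # Same numerical answer (up to float32 rounding), different silicon.
    assert np.allclose(c_cpu, c_gpu, rtol=1e-3)

Problems that outgrow the CPU path, as in Conway’s example, are exactly the ones these deskside clusters target.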

But despite rising sales and glowing reviews from early adopters, the market may not be living up to its potential: only companies in the energy sector have truly gotten behind the concept on a wide scale, according to Conway.

Charles King, principal analyst with Hayward, Calif.-based Pund-IT Research Inc., said that despite their attractive price-performance ratio, HPC systems remain expensive and complex to implement. Manufacturing and design shops making their first foray into the supercomputing world don’t have the in-house software development or application expertise to take advantage of the added processing power, he said.

Conway added that vendors such as IBM and Hewlett-Packard Co. will have to keep working to make these workgroup supercomputers easier to purchase, easier to use and easier to maintain.

“Companies have to be able to use them without being a systems integrator,” he said.

Microsoft Corp. has been trying to address this issue with its upcoming release of Windows HPC Server 2008 R2, according to King. With the vast majority of supercomputing installations running on Linux or a Linux variant, most supercomputing shops either write their own code or tweak existing software to take advantage of their installations.

“Microsoft sees an opportunity for capturing the performance benefits of supercomputing by working with partners to develop pretty much out-of-the-box applications for smaller companies to get on-board,” said King.

Another barrier, he said, comes at the application level, as companies will not invest in the computing power if their applications aren’t taking full advantage of it.

“If you look back at the Sony PlayStation 3, for example, it was a nine-core processor,” said King, referring to the console’s Cell chip, which paired eight specialized co-processing elements with one general-purpose Power core. “It’s arguable that a lot of the trouble that the PS3 had in gaining market traction was that it had a next-generation hardware platform that app developers were having trouble figuring out.”

“If software developers can’t figure out how to gain maximum performance from computing power, it’s like having a car with a thousand-horsepower engine but a speed regulator on the gas pedal,” he added.
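King’s analogy maps neatly onto code. The generic sketch below (not tied to any application King discussed) runs the same CPU-bound jobs once on a single core and once across all of them with Python’s standard multiprocessing module; software that only ever takes the first path leaves most of the machine idle.

    # A minimal sketch of the "speed regulator" effect: identical work run
    # on one core, then spread across every core with multiprocessing.
    import time
    from multiprocessing import Pool, cpu_count

    def heavy(n: int) -> int:
        # Stand-in for a CPU-bound task, e.g. one simulation step.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        jobs = [200_000] * 64

        t0 = time.perf_counter()
        serial = [heavy(n) for n in jobs]      # one core does everything
        t1 = time.perf_counter()

        with Pool(cpu_count()) as pool:        # every core shares the load
            parallel = pool.map(heavy, jobs)
        t2 = time.perf_counter()

        assert serial == parallel
        print(f"serial {t1 - t0:.2f}s, parallel {t2 - t1:.2f}s "
              f"on {cpu_count()} cores")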

Conway said that application developers will catch up as more companies show interest in moving to personal supercomputers. “The more money involved, the more attractive it will be to software developers.”

And according to Russ Conwath, senior research analyst with London, Ont.-based Info-Tech Research Group Ltd., many organizations will take interest in the next few years.

“Anybody that does existing processing in a PC environment, but potentially wants more performance, will be interested,” he said, adding that financial research firms and weather analysis companies could be considered perfect candidates for this type of computing.

And for IT managers, the data centre impact will be no different from adding another box to the computing environment, Conwath said, apart from the need for some extra security precautions.

King added that because so much of the performance gain is built into the software, once app developers catch up, the hardware refresh cycle on personal supercomputers will be much longer than that of traditional PCs.

“Rather than upgrading to this year’s processor or server, you can just add additional servers to the cluster and dial it up that way,” he said.
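That scale-out model is what message-passing frameworks such as MPI were built for. As a hedged sketch, assuming the mpi4py package and an MPI runtime (launched with, say, mpirun -np 8 python sum_chunks.py, where the script name is hypothetical), the program below splits a sum across however many ranks the cluster provides; adding servers means raising the process count at launch, not rewriting the code.

    # A minimal scale-out sketch, assuming the mpi4py package and an MPI
    # runtime. Growing the cluster raises the rank count; the code is unchanged.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    # Each rank works on its own slice of the problem...
    chunk = np.arange(rank * 1_000_000, (rank + 1) * 1_000_000, dtype=np.float64)
    local_sum = chunk.sum()

    # ...and the partial results are combined across the whole cluster.
    total = comm.reduce(local_sum, op=MPI.SUM, root=0)
    if rank == 0:
        print(f"{size} ranks, total = {total}")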
