Institutions unveil high-performance computing network

Innovation keeps the Canadian economy competitive, and innovation thrives on research and development. That leaves IT decision-makers hard pressed to find faster, larger computing resources capable of supporting massive simultaneous workloads.

Recently, the Canada Foundation for Innovation (CFI) and the Natural Sciences and Engineering Research Council of Canada (NSERC) announced the launch of a pan-Canadian research network, called High Performance Computing (HPC), to deliver new tools for computationally intensive research.

The $88-million project unites seven regional computing consortia, such as ACEnet, under the banner of a single national organization, Compute/Calcul Canada. The new network, aimed primarily at university researchers with large computational needs, will help them share data on research projects throughout the country.

HPC will serve more than 60 universities and more than 6,000 researchers across Canada. The CFI says the network will include multiple dedicated light paths with a combined bandwidth of at least 10Gbps, along with a national data storage capacity totalling 6.2 petabytes.

“We felt it might serve everybody’s best interests if we brought them together and developed a proposal for [a single] national HPC network,” says Eliot Phillipson, president and CEO of the CFI.

Phillipson says each regional consortium will now differentiate itself by specializing in a specific area of computing.

“It will help [researchers] by providing access to the appropriate hardware and software, as well as access to the appropriate technical people who will advise them. The CFI has been considering the importance of high-performance computing because it will play [a major role] in Canada’s future scientific endeavours and research,” he says.

One university researcher suggests the combined HPC resources promise opportunities vastly greater in scope than previously possible.

“We’re purchasing an array of tier-two systems that will be distributed over the Canadian landscape, with some systems reaching a theoretical maximum of hundreds of teraflops,” says Dick Peltier, professor of physics at the University of Toronto.

“We will be installing capacity clusters with thousands of CPUs and much higher-bandwidth interconnects, as well as a vector computing platform [at the University of Toronto] for the most highly coupled jobs, [such as those] required for climate simulations.”

Phillipson says the HPC was established not only to support high-performance research activities, but also to act as one of 10 international data storage centres for activities related to the Swiss-based European Organization for Nuclear Research, or CERN.

“The volume of data coming out of CERN would overwhelm any one data centre. Now the data is streamed to Canada’s major particle physics centre at the University of British Columbia. Through the HPC, it can be streamed across the country to all of the researchers involved,” he says.

The regional consortia involved in the HPC will serve researchers’ computing needs in a variety of capacities, says Ken Edgecombe, executive director of the High Performance Computing Virtual Laboratory (HPCVL), one of the consortia, based at Queen’s University in Kingston, Ont.

“It’s not just about hardware; part of it is the support network, which involves experts providing user support so we can discern what kind of architecture users need. That should help them with their research and speed it up,” says Edgecombe.

A public-private partnership among Sun Microsystems Inc., the CFI, NSERC and other levels of government, the HPCVL will focus on one specific computing demand on the network, called Shared Memory Computing, concentrating its resources on serving that particular researcher need through the new HPC.
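Shared memory computing lets many processors work on a single problem through one common pool of memory, rather than exchanging messages between separate machines, which suits tightly coupled workloads. As a minimal, hypothetical sketch of the programming style involved (illustrative only, not HPCVL code), the following C program uses OpenMP, a standard shared-memory API, to split a summation across all available cores:

/* Hypothetical example of shared-memory parallelism using OpenMP.
   Build with: gcc -fopenmp shared_sum.c -o shared_sum */
#include <stdio.h>
#include <omp.h>

#define N 10000000

int main(void) {
    static double a[N];     /* one array, visible to every thread */
    double sum = 0.0;

    /* Each thread fills a disjoint slice of the shared array. */
    #pragma omp parallel for
    for (long i = 0; i < N; i++)
        a[i] = 0.5 * (double)i;

    /* reduction(+:sum) gives each thread a private partial sum,
       then combines them safely when the loop ends. */
    #pragma omp parallel for reduction(+:sum)
    for (long i = 0; i < N; i++)
        sum += a[i];

    printf("sum = %.1f using up to %d threads\n", sum, omp_get_max_threads());
    return 0;
}

Because every thread sees the same memory, no data has to be copied between nodes; the trade-off is that such machines are harder to scale than commodity clusters, which is why a consortium would specialize in providing them.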

Peltier says he and the other HPC supporters are hoping for a refresh cycle on the computing resources every three to five years. The refresh cycle, like the ongoing operation of the HPC network itself, remains dependent on continued federal government funding of the CFI.

“We’re hoping this national platform will operate in perpetuity. The arguments are very strong in terms of Canadian competitiveness and productivity,” he says.

“I work on long-range climate modeling. These are huge problems in scientific computation and the current equipment we have is not up to the task. The new systems will be helpful in all of this work.”

QuickLink: 078640
