For those of us who switched majors several times in university before settling on the subject that caused us the least amount of grief, there is some comfort in knowing that we are not alone. Not every great doctor spent his or her childhood studying human anatomy. Likewise, not every notable engineer designed bridges out of Lego as a tot.

Such is the case of professor Hugh Couchman of the physics and astronomy department of McMaster University in Hamilton, Ont.

Contrary to what one might imagine of a boy who would grow up to devote his career to astrophysics, the young Couchman did not come from a scientific family, nor did he spend his childhood near Canterbury, England, gazing at stars through a telescope.

“I wasn’t interested in astronomy until I went to university,” Couchman remembered. “I took a course in my last year as an undergraduate from Martin Rees, who later became Astronomer Royal. He got me enthused about cosmology, which is what I now do.”

Couchman received his undergraduate and graduate degrees from Cambridge University, earning his PhD from the Institute of Astronomy in 1986. He moved to Canada for a postdoctoral fellowship at the Canadian Institute for Theoretical Astrophysics, worked as an assistant professor of astronomy at the University of Toronto, and moved to the Department of Astronomy at the University of Western Ontario in 1991.

In 1999, Couchman joined the physics and astronomy department at McMaster University, and has been instrumental in making the university a hub for Canadian and international computational scientists and physicists by prompting the recent installation of an AlphaServer SC system – a supercomputer.

Part of the reason McMaster installed the supercomputer is the work Couchman performs with VIRGO, an international group of astronomers and cosmologists researching structures in the universe.

“Me being a part of this VIRGO collaboration has been useful,” Couchman admitted, before describing the purpose of the group.

“The goal of VIRGO is to use supercomputers to model the large-scale structure of the universe,” Couchman explained. “Over 10 years ago I wrote a computer code which allowed you to do very big simulations in a given amount of computer resources, so if you have a computer with a given amount of size, a fixed amount of speed, etc., it allowed you to really maximize those resources to really do a large simulation.

“In cosmology that’s what we’re always trying to do, because the range of scales in the universe, from the low scale of a galaxy all the way up to the largest structures that we can see is just enormous,” Couchman continued. “To model that realistically, you need really high resolution simulations. In the mid-1990s or so, parallel computers started coming on the scene. In principle, these allowed you to do much larger simulations, which were closer to reality. And so we took my code, and with the people in VIRGO, we parallelized it and used that new code to do some of the large simulations of the cosmic structure.”
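Couchman’s production code is, of course, far more sophisticated than anything that fits in a few lines, but the basic calculation behind such cosmological simulations is a gravitational N-body problem: compute the force every particle exerts on every other, then advance positions and velocities in time. The following is a minimal, illustrative sketch in Python (the function names and the direct-summation method are purely for illustration and are not taken from Couchman’s code, which uses far more efficient mesh-based techniques):

```python
import numpy as np

def accelerations(pos, mass, softening=0.1):
    """Direct-summation gravitational accelerations, O(N^2).

    The softening length avoids the force blowing up during close
    encounters, a standard trick in particle simulations.
    Units are chosen so that the gravitational constant G = 1.
    """
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        d = pos - pos[i]                        # vectors to every other body
        r2 = (d ** 2).sum(axis=1) + softening ** 2
        inv_r3 = r2 ** -1.5
        inv_r3[i] = 0.0                         # remove the self-interaction term
        acc[i] = (d * (mass * inv_r3)[:, None]).sum(axis=0)
    return acc

def leapfrog_step(pos, vel, mass, dt):
    """Advance one kick-drift-kick leapfrog step.

    Leapfrog is the usual choice for N-body work because it is
    time-reversible and conserves energy well over long runs.
    """
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel
```

The O(N²) cost of the inner loop is exactly why the resource-maximizing and parallelization work Couchman describes matters: a realistic simulation uses millions of particles, so the per-particle force calculation must be distributed across many processors and accelerated with cleverer algorithms than direct summation.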

Until the McMaster installation, Couchman had to book time on a supercomputer in either the U.K. or in Munich to carry out his research. The proximity to the new supercomputer is a distinct advantage for Couchman, his colleagues and his students, as they are able to experiment and play with their theories without having to worry about travel or time constraints.

“[Using the supercomputers abroad] was a disadvantage. If you had an idea, you had to apply for time on it, and decide how it would be done,” Couchman said. “That really removes some of the spontaneity. Now, if I’m having a coffee time discussion with some post docs and someone comes up with an idea, you can try it and see if it works. Having that accessibility allows for more experimentation.”

Couchman cites creating a culture around supercomputing as another benefit to having the equipment on hand.

“If you don’t have the local hardware – if you have to travel or use it remotely across the network – it’s much harder to train your students,” Couchman said. “You don’t get as many people interested in using the resource, maybe because they don’t know if they’ll continue to have access to it or for whatever reason. You can end up as an isolated person in an institution using supercomputing. Your students don’t get the training, so you don’t bring your knowledge and skills back to Canada, and so it removes that component of supercomputer use.

“The idea is to try and build a culture of high-performance computing,” Couchman said. “I can say to someone, ‘I don’t know how your problem works,’ and you can work on it together. Soon you start to gather a critical mass of people and start to grow this culture.”

Couchman’s research is concerned with the structure of the post-recombination universe, from about 10⁵ years after the big bang to the present, about 10¹⁰ years later.

“Now that we have very large computers available to us, we can realistically simulate many different physical systems at a level of realism that just wasn’t possible before,” Couchman said. “You can now do simulations which really do start to show the properties and elucidate the properties of physical systems.”

Couchman believes that working with supercomputers is a step forward in the evolution of science.

“In high performance computing circles, people often now talk about computing as the third way of understanding nature,” Couchman explained. “You have traditional theory, where people sit down and try to figure out a pure analytic, theoretical description of something. On the other side, you have traditional experiment, where people go into labs and pour stuff in test tubes or look at properties of a piece of matter or something, and set up some kind of experiment. For astronomy, the equivalent of the experiment is observation. You take your telescope and look at the universe. You can’t experiment with a star or a galaxy, but you can at least look at them and see them in different stages of evolution.

“Now, computing allows you to do a much more controlled experiment,” Couchman said. “So you can take a theory and you can put it into a computer in terms of a physical model or a simulation, and you can run that simulation and compare it directly with an experiment in many cases. In some senses, it’s that third way that bridges the other two ways of traditional theory and experiment. It’s the first time that you can really experiment with the universe.”

The Alpha-based supercomputer by Compaq consists of four-processor nodes with shared memory, strung together with a high-speed network, giving the AlphaServer SC 96 processors in total.

Installed in March, the McMaster supercomputer was funded by the Canada Foundation for Innovation and the government of Ontario, and is part of SHARC-net, a group that is seeking funding for similar installations at the University of Guelph and the University of Western Ontario.

Since the AlphaServer SC’s installation, the department has received funding to hire more faculty. Couchman himself has been approached informally by a number of students inquiring about graduate work in relation to the supercomputer, and has taken on one post-installation postdoctoral fellow, early proof that this high-tech addition to the department will attract new people.

To find out more about professor Couchman’s work or about the supercomputer itself, go to
