Patience pays off for utility

Some IT projects take a long time to come to fruition. The campaign to get Newfoundland Power to switch to a virtualized environment for its office, ERP and billing systems took half a decade.

However, the principals involved believe the constant work (it might be likened to the drip, drip, drip of pressure on managers) was worth the effort.

“Several of us in the technology group were preaching the benefits of this five years ago,” recalls John Pope, a project lead in the utility’s Information Services’ applications and solutions division and one of the leaders of the crusade. “It wasn’t an easy sell.”

Perhaps that’s understandable. As a utility responsible for generating electricity and keeping the power on for some 227,000 people in the province, the company is careful about risk.

“We’re a very traditional electrical utility,” says Pope, “so anytime we look at newer technologies we tend to take our time and go very slow.”

But such deliberation has a price. By 2005, the utility had a data centre in St. John’s with some 120 Hewlett-Packard or Compaq servers running its applications at no more than 30 per cent capacity.
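To put that in perspective, here is a rough, back-of-the-envelope sketch (in Python, using only the figures above) of how much of that hardware was effectively sitting idle:

    # Rough estimate of idle capacity, using the article's figures:
    # about 120 physical servers, none running above 30 per cent capacity.
    physical_servers = 120
    utilization_ceiling = 0.30   # "no more than 30 per cent capacity"

    # Express the real workload in "fully busy server" equivalents.
    busy_equivalents = physical_servers * utilization_ceiling   # 36 servers' worth of work
    idle_equivalents = physical_servers - busy_equivalents      # roughly 84 servers' worth of idle iron

    print(f"Useful work: about {busy_equivalents:.0f} fully busy servers' worth")
    print(f"Idle capacity: the equivalent of about {idle_equivalents:.0f} servers")

In other words, the same workload could in principle run on a fraction of the machines, which was exactly the pitch the technology group had been making.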

“We came from an era when most of the vendors dictated that everything ran on its own,” Pope explained.

All but about six applications are Windows-based, including Microsoft’s Exchange for e-mail and Great Plains for enterprise resource planning, and Invensys’ Avantis for asset management. However, the company also does a lot of in-house development, such as creating an application that calculates electrical load span and rewriting the customer information and billing system that runs on OpenVMS.

While keeping the number of vendors down had its advantages, there were disadvantages as well. Beyond the inefficient use of the servers, it usually took the IT department at least 24 hours to prepare a server for developers to test a new application on, time taken away from other work.

Just as important, because the department didn’t have a data replication system, a disaster recovery test took a full day and meant the loss of new data. “Our disaster recovery was killing us,” Pope says.

About two years ago, as industry standards on virtualization began to emerge, the department began thinking about it more seriously.

Halfway through 2005, a three-person steering committee made up of the department chief and the directors of the solutions and infrastructure groups agreed to let Pope and infrastructure specialist Chris Seary form a team to explore the options.

“We had done a lot of reading, a lot of research on the ‘Net, so we knew very well what the technology could do,” Pope said.

Still, their bosses “caused us to slow down – and rightly so – to get the confidence this new technology was capable of doing for us what we said it could do.” In the fall of 2005, tests were run with a single application on Microsoft’s Virtual Server and VMware’s ESX.

The goal, team member Melvin Goodyear said, was to show that virtualization could simultaneously run production and test environments, improve the availability of core applications and slash disaster recovery time. Initially, the tests showed little difference in performance between Virtual Server and ESX, although the latter needed a more powerful server. “We were very close to going down the VMware road,” says Pope. However, it was more expensive.

After being briefed on Microsoft’s product roadmap, which outlined management capabilities about to be included in the software, the utility chose Virtual Server. At the same time they picked replication software from Double-Take Software for backup.

Early last year they began to build the virtualized system. “We quickly realized having a strong host system was going to be important for us to provide the flexibility we said we could,” Pope said. So they invested in an HP ProLiant server with four AMD Opteron CPUs, 32GB of memory and lots of storage. (A second with four dual-core processors was added later.) Data transfer was accomplished with HP’s Virtual Machine Management Pack.

By last June they were ready to go live. The careful, deliberate approach had paid off: the first disaster recovery drill took a mere 30 minutes, with no data loss. Similarly, it now takes only half an hour to set up a virtual server for the application developers.
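The article doesn’t say how the department scripts that half-hour setup, but Virtual Server ships with a COM automation interface that makes this kind of provisioning straightforward to automate. The following is a minimal, illustrative sketch only, in Python with the pywin32 package on the Virtual Server host; the guest name and path are made up, and the method and property names should be verified against Microsoft’s Virtual Server documentation:

    # Illustrative sketch: create and start a guest through Virtual Server's
    # COM automation interface (ProgID "VirtualServer.Application").
    # Assumes pywin32 is installed and the script runs on the host with
    # administrative rights; verify member names against Microsoft's docs.
    import win32com.client

    VM_NAME = "DevTest01"                       # hypothetical guest name
    VM_PATH = r"D:\VirtualMachines\DevTest01"   # hypothetical configuration folder

    vs = win32com.client.Dispatch("VirtualServer.Application")

    # Create the guest's configuration (.vmc file) in the given folder.
    vm = vs.CreateVirtualMachine(VM_NAME, VM_PATH)

    # Give it some memory (in MB) and power it on.
    vm.Memory = 1024
    vm.Startup()

    print("Created and started virtual machine", VM_NAME)

A script along these lines, wrapped with disk-image copying and naming conventions, is the sort of thing that turns a day-long server build into a half-hour task.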

Since then the department has been slowly winnowing down the number of servers, retiring older, less energy-efficient boxes. They’re down to 100 now, and Pope hopes another dozen will be taken off-line this year.

In total there are four virtual hosts running 50 guest operating systems. Not all the applications have been switched, however. Microsoft says the current version of Great Plains (now called Dynamics GP) can’t be virtualized, which is tying up five of the utility’s servers. Pope said Microsoft has promised the next version of the application will support virtualization.

Still, the IS department sings the hosannas of virtualization. “We will not set up a physical box for a development or test environment, even if it’s an evaluation of a software product,” Pope says. “We will push the vendor to allow us to do it in virtual.”

A side note: Ironically, after having been deliberate for so long, had the department waited a little longer it could have had Virtual Server for nothing; it has been free since December 2005 with Release 2. A hypervisor version, to be called Windows Server Virtualization, will be included as part of the upcoming Windows Server 2008.

WinServer Virtualization doesn’t need a host operating system, and therefore can connect directly with certain virtualization-tuned CPUs from Intel and AMD. That should give better performance, said Hilary Wittmann, a senior product manager at Microsoft Canada.
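The “virtualization-tuned” processors Wittmann refers to are those with Intel’s VT-x or AMD’s AMD-V extensions. As an aside, on a Linux machine you can quickly check whether a CPU advertises them by reading /proc/cpuinfo (Windows administrators would check the BIOS or a vendor utility instead); a short Python sketch:

    # Check whether the processor advertises hardware virtualization extensions.
    # Intel's show up as the "vmx" flag (VT-x); AMD's as "svm" (AMD-V).
    flags = set()
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                flags.update(line.split(":", 1)[1].split())

    if "vmx" in flags:
        print("Intel VT-x supported")
    elif "svm" in flags:
        print("AMD-V supported")
    else:
        print("No hardware virtualization extensions reported")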

However, WinServer Virtualization will come in the form of a beta release when Windows Server 2008 comes out. The full version will be available for download to registered users 180 days later.

Even then, it will not have all of the features Microsoft promised a year ago. To meet deadlines, Microsoft has had to drop the “hot-add” capability, which lets administrators add resources such as memory, processors and storage to running virtual machines. That capability will have to wait for a future version of the software, Wittmann said.


