Canada clears up its cloud strategy

Government is overspending on IT, yet productivity growth has declined – and it’s reaching a crisis point during this period of fiscal restraint. As Canada’s CIO looks to move away from infrastructure and focus on developing services instead, new opportunities such as cloud computing are coming to the fore.

The business case for cloud computing comes down to underutilized capacity, said Jirka Danek, CTO of Public Works and Government Services Canada (PWGSC), during the Public Sector CIO Forum hosted by Insight Information in Toronto last month.
There are 325,000 employees in the federal government, 140 departments (each with its own CIO), 124 networks and 144 data centres across the country that he knows of. And 120,000 Wintel and Unix servers use less than 10 per cent of their capacity. “To make matters worse, 40 per cent of IT professionals are eligible for retirement in the next five years,” he said. “So we have to leverage the private sector a lot more.”
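The arithmetic behind that business case is simple. Here is a rough sketch in Python; the 70 per cent target utilization is an assumed figure for illustration, not a number from PWGSC:

```python
# Back-of-the-envelope look at the underutilization Danek describes.
# The 70 per cent target utilization is an assumption for illustration only.

servers = 120_000          # Wintel and Unix servers cited for the federal government
avg_utilization = 0.10     # "less than 10 per cent of their capacity"
target_utilization = 0.70  # hypothetical post-consolidation target

# Work actually being done, expressed in fully utilized server equivalents
servers_needed = servers * avg_utilization / target_utilization

print(f"Servers today: {servers:,}")
print(f"Servers needed at {target_utilization:.0%} utilization: {servers_needed:,.0f}")
print(f"Potential reduction: {1 - servers_needed / servers:.0%}")
```

On those assumptions, the same workload would fit on roughly 17,000 fully utilized servers, a reduction of more than 85 per cent.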
The Treasury Board of Canada has obtained agreement across departments on the language and definitions for cloud computing and received endorsement for the Government of Canada’s cloud computing roadmap – one that can be validated with countries such as the U.S., the U.K., Australia and New Zealand.
Essentially, cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction. The five essential cloud characteristics are on-demand self-service, ubiquitous network access, resource pooling, rapid elasticity and measured service. “Departments don’t have to buy our services,” said Danek. “We have to demonstrate cost of ownership.”
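To make those characteristics concrete, here is a minimal, illustrative sketch of on-demand self-service, rapid elasticity and measured service working together. The CloudProvider class is hypothetical and simply stands in for an IaaS provider's API; it is not any department's actual implementation:

```python
# Illustrative sketch only: a hypothetical provider object standing in for an
# IaaS API, showing on-demand self-service, rapid elasticity and measured service.

from dataclasses import dataclass

@dataclass
class CloudProvider:
    running: int = 0             # instances currently provisioned
    instance_hours: float = 0.0  # metered usage for billing

    def scale_to(self, demand: int, capacity_per_instance: int = 100) -> None:
        # On-demand self-service and rapid elasticity: provision exactly the
        # capacity the current demand requires, and release the rest.
        self.running = -(-demand // capacity_per_instance)  # ceiling division

    def meter(self, hours: float = 1.0) -> None:
        # Measured service: pay only for instances actually running.
        self.instance_hours += self.running * hours

cloud = CloudProvider()
for hour, demand in enumerate([150, 900, 2500, 400]):  # requests per hour (made up)
    cloud.scale_to(demand)
    cloud.meter()
    print(f"hour {hour}: demand={demand:>4}, instances={cloud.running}")

print(f"billable instance-hours: {cloud.instance_hours:.0f}")
```

The point of the sketch is the cost model: capacity tracks demand hour by hour, and the bill reflects what was actually provisioned rather than peak capacity bought up front.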
Cloud computing service models range from software as a service (SaaS), to platform as a service (PaaS) and infrastructure as a service (IaaS). Deployment models include public clouds (available to the general public, like Amazon), private clouds (operated solely for an organization), hybrid clouds (two or more clouds, unique but bound together by standardized or proprietary technology that enables data and application portability) and community clouds (shared by several organizations that have shared concerns).
So far, they’ve finalized a security architecture, said Danek, and are now offering a community cloud for pay, pension, CampusDirect, GC Intranet and Canada.gc.ca. Short-term goals are to use SaaS for internal collaboration (such as GCPedia, GCConnex and GCForum), PaaS for commoditized Web hosting and IaaS for virtual storage and computing services. Long term, SaaS will be used for virtual offices, collaboration and federated management and directories. PaaS will be used for cloud-based application and database hosting, on-demand services and process automation, and IaaS will be used for departmental private and public cloud peering.
Cloud computing is easy and fast to deploy, you pay only for what you use and you get the latest functionality in a more standardized IT environment. On the other hand, challenges – at least perceived ones – include security, performance, availability and regulatory requirements. “The Patriot Act is a key concern,” said Danek. But there’s a need to address challenges, because they’re keeping government from innovation.
There’s a lot of hype around cloud and some believe it’s the way of the future. “It’s growing, and it’s growing quickly, but it’s important to remember even in the future people will continue to own infrastructure,” said Duncan Stewart, director of research for technology, media and telecommunications with Deloitte Canada Research. “Something like 80 per cent of all the world’s financial transactions still run on COBOL.”
The most commonly stated risk is security, but while this remains a high-profile issue, he’s not sure there’s anything inherently insecure about the cloud. The same can be said about reliability. “We think this is probably more perceptual than reality,” he said. Twitter, in the beginning, was pure cloud, but when it experienced failures, it was because the software wasn’t working properly – not because of the cloud. While security and reliability appear not to be long-term impediments, you’ve got to resolve the perception first, he added.
The real challenge is data portability. “When I’m using the cloud and the company providing it to me blows up, goes bankrupt or decides they don’t want to be in that business anymore, can I move all of my data, all of my applications, all of my processes to another provider?” said Stewart. “Who owns that data? And when I move it, does everything work the way it did on the old system?” Until this actually happens, it’s hard to prove how it will work.
Portability is going to be a much tougher challenge than security or reliability, so we’re going to see governments and large enterprises only cautiously moving some of their data into the cloud. Cloud computing was roughly a $50 billion market last year and it’s expected to be $80 billion this year, said Stewart, but almost all of that growth is going to come from consumers and small businesses. “If the Government of Canada loses all of their tax records, we’ve got a problem,” he said. “Government is moving toward the cloud, but they’re moving the way clouds do – not terribly quickly.”

But longer-term, governments will need to consider what their core competencies are – and whether or not that involves running networks. “Organizations that focus on what it is they do really well and outsource the things that are non-mission-critical usually save money and usually the quality of IT goes up,” said Stewart.

Transforming service delivery

“Our collective challenge in the public sector is to transform service delivery,” said Corinne Charette, CIO of the Government of Canada, who also spoke at the Public Sector CIO Forum. The phenomenon threatening the public sector, she said, is the unrelenting growth of IT projects – and that complexity has mushroomed to proportions we all need to start worrying about.

This is compounded by redundant applications, weak interoperability and cost-prohibitive upgrades. “The legacy landscape is starving investment in innovation,” said Charette. The federal government spends $5 billion annually on IM/IT; in 2007 only 13 per cent of that budget went to innovation – which isn’t much if government is looking to transform service delivery.
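In dollar terms, that split is stark. A quick check on the figures Charette cites:

```python
# The split Charette cites: $5 billion annual IM/IT spend, 13 per cent to innovation.
total_spend = 5_000_000_000
innovation_share = 0.13

innovation = total_spend * innovation_share
operations = total_spend - innovation

print(f"Innovation:                           ${innovation / 1e6:,.0f}M per year")
print(f"Everything else (legacy, operations): ${operations / 1e6:,.0f}M per year")
```

Roughly $650 million a year goes to innovation, against about $4.35 billion spent keeping existing systems running.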

So why is project spending exploding when the cost of technology is decreasing and performance is increasing? It boils down to complexity. “There needs to be a focus on simplicity if we are really going to transform service delivery,” said Charette. One of the root causes of complexity is legacy. “Legacy is killing us, [but] legacy does not have to kill us. We’re spending too much on just keeping the lights on.”
Government is dealing with a hodgepodge of legacy, from HR to financial systems. There are more than 30 HR systems of the same flavour, for example, in federal government. “There’s more than one of just about everything,” said Charette. “Provinces redo the work, municipalities redo the work. Why are we stuck there? Every time we put a bolt on legacy, we delay the inevitable.”
The biggest problem is that government requirements are almost always over-engineered. Commodity components are repurchased, reinvented and re-customized. “Infrastructure is not unique,” she said. “Infrastructure is a commodity. How much of our budget is tied up with people and projects that are infrastructure-bound?” Projects are too definitive and too big from the get-go – mega-projects with mega-price tags. They’re hard to scope, estimate and procure, and they’re impossible to manage. And this is contributing to spiralling costs, unmanageable projects and flagging credibility with the public.
Another problem is the very nature of the public sector itself – departments and jurisdictions have their own political mandates and priorities, causing them to implement the same systems over and over again. “It makes no sense,” said Charette. “We just cannot afford to build the same systems over and over again.”
In a recent study on why IT spending isn’t creating more value, PricewaterhouseCoopers found that productivity growth has declined since 2002. Now we have a structural deficit as a result of the recession, meaning there’s an awful lot of pressure – and momentum – around getting value out of our IT spend, said Ellen Corkery-Dooher, partner in the advisory services practice with PricewaterhouseCoopers. “While we may have been a bit slower in the public sector to address the IT spend, now there’s a real, visible, tangible pressure to open up the dialogue.”
One of the first challenges is getting a grip on that IT spend. In the private sector, total cost of ownership (knowing what the core business is, and which parts of the IT spend the organization should own itself) is a well-established concept. That has traditionally not been the kind of dialogue that goes on in the public sector. If public-sector organizations don’t have a grip on their costs and service levels, however, it’s difficult to manage out complexity or focus on innovation.
Since Y2K, there’s been a high operational spend in IT, with a duplication of infrastructure and proliferation of business applications. Maintenance costs have become a significant portion of the budget, with very little investment in integration of legacy applications. At the federal level, there’s now a management accountability framework in place and a push toward shared services. “But they have to get a handle on their costs and look at the spend horizontally across the organization,” said Corkery-Dooher. “There’s far too much spend in the government on maintaining old systems and applications.”
One of the biggest challenges government now faces is interoperability. “We’ve invested in all these great applications but we don’t share business processes from a citizen perspective,” she said. “I can’t easily be viewed as a single client because the infrastructure doesn’t allow for interoperability, unlike a bank.”
We can deliver more value if we move toward a continuous improvement approach, she said, building platforms and applications that can evolve instead of having to be replaced. “There’s a real lack of creativity around thinking about new ways to do things, new ways to deliver value to internal or external clients,” said Corkery-Dooher. “A recession is a good thing. We shouldn’t waste it. It’s an opportunity to think about how we invest our dollars and get value out of that.”
In today’s world, we don’t have to be tied to buildings or locations – and cloud computing is part of that formula. “It’s not a new idea, but I would say it hasn’t really been considered strategically or embraced in the public sector in particular, so there’s a lot of opportunity there,” she said.

The arch-challenge, said Charette, is that government has a culture where consultation trumps practicality and prompt decision-making. Consultation is necessary and worthy, but when it is unchecked, unmanaged and unconstrained, it becomes an end in itself and a huge consumer of project time, resources and money.
