In the time it takes to get a cup of coffee, any one of the hundreds of engineers and developers at mobile computing chip maker Qualcomm Inc. can provision a new server — one that’s fully configured with compute, storage, networking, middleware and other resources. “You can get something provisioned within 15 minutes,” says Matthew Clark, senior director of IT.
The San Diego-based company’s self-service infrastructure, dubbed “AutoZone,” includes all the hallmarks of a private cloud: a self-service portal with a menu of standardized services that users can choose from; a virtualized infrastructure built using highly standardized suites of hardware and software; fully automated provisioning and deprovisioning; and the ability for IT to meter and allocate costs from the division level down through to the individual department, application or user.
But don’t ask Clark to call it a private cloud.
“We’re getting close to a cloud, but I would call our implementation a cross-platform, automated virtualized environment,” he says. His reticence is understandable because, as he points out, everyone seems to have a different definition of cloud computing. Clark’s is very simple, if broad: “Cloud is efficient use of resources across your infrastructure, which can also be external services.”
James Staten, an analyst at Forrester Research Inc., says Qualcomm is on the leading edge. No matter how Clark defines cloud computing, Staten says, “Qualcomm is a lot further along than most companies are.”
Clark’s peers are watching with great interest. Private clouds are rapidly becoming a priority for corporate IT heading into 2011. The core elements — on-premises virtualization, standardized infrastructure and service offerings, and automated provisioning — can deliver cost efficiencies and reduce administrative overhead. But the biggest driver by far is time to market, IT executives say. “We look at the benefit as quicker time to onboard applications, not cost savings. It’s all about speed of deployment,” says Norm Fjeldheim, Qualcomm’s CIO.
There’s another motivating factor at work as well, says Staten: potential competition from public cloud offerings, which IT’s customers are increasingly using as an alternative to traditional IT-hosted systems. “There’s this fear that the public cloud will make IT irrelevant,” Staten says, and many IT organizations are feeling pressure to have an internal “cloud answer” for the business side as soon as possible.
With all of the hype about cloud, it’s important to keep in mind that, unlike some public cloud services that promise instant gratification, private cloud is an infrastructure play that represents a long-term payback. “It’s not something you drop in today and immediately see a 50% return on investment,” says Mitch Daniels, technical director at integrator ManTech International Corp. “The expectations-setting really needs to be addressed upfront, especially with management.”
Defining Your Cloud
Fjeldheim defines cloud as “a complete abstraction of the hardware away from the applications that run on it.” What Qualcomm has done isn’t a cloud, he says, because it has virtualized classes of hardware, creating virtual infrastructure silos for Windows, Linux and Solaris environments. “We can’t run an application on any piece of hardware we have,” he says.
Key Attributes of a Private Cloud
* Highly standardized infrastructure
* Automated provisioning
* User self-service
* Catalog of services
* Usage-based cost accounting and chargeback
* Dynamic resource scaling as application loads change
Source: Computerworld interviews with IT executives.
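The attributes in the sidebar above can be reduced to a toy model — a self-service catalog with automated provisioning, catalog enforcement and usage-based chargeback. Everything here is a hypothetical sketch for illustration: the offering names, sizes and hourly rates are assumptions, not details from Qualcomm or any other company in this article.

```python
from dataclasses import dataclass
import itertools

# Hypothetical catalog entries; names, sizes and rates are illustrative.
CATALOG = {
    "linux-small": {"vcpus": 2, "ram_gb": 4,  "rate_per_hour": 0.10},
    "linux-large": {"vcpus": 8, "ram_gb": 32, "rate_per_hour": 0.55},
    "windows-mid": {"vcpus": 4, "ram_gb": 16, "rate_per_hour": 0.40},
}

_ids = itertools.count(1)

@dataclass
class Server:
    server_id: int
    offering: str
    owner: str
    hours_used: float = 0.0

class PrivateCloud:
    """Toy model: self-service provisioning plus usage metering."""

    def __init__(self):
        self.servers = {}

    def provision(self, offering: str, owner: str) -> Server:
        # Only standardized catalog choices are allowed -- no customization.
        if offering not in CATALOG:
            raise ValueError(f"{offering!r} is not in the service catalog")
        s = Server(next(_ids), offering, owner)
        self.servers[s.server_id] = s
        return s

    def deprovision(self, server_id: int) -> None:
        del self.servers[server_id]

    def chargeback(self, owner: str) -> float:
        """Sum metered cost for one department, application or user."""
        return sum(s.hours_used * CATalog_rate(s.offering)
                   for s in self.servers.values() if s.owner == owner)

def CATalog_rate(offering: str) -> float:
    return CATALOG[offering]["rate_per_hour"]
```

A real implementation would sit behind a self-service portal and drive an orchestration layer; the point of the sketch is only that every attribute in the sidebar maps to a small, enforceable piece of logic.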
Also, Qualcomm’s virtual infrastructure can’t yet scale to support application workloads dynamically — something Fjeldheim sees as a key attribute of an internal, private cloud. “We don’t have the ability for the virtual infrastructure to automatically provide additional infrastructure to an application that needs it, and then take it away when it doesn’t,” he says.
In addition to the core attributes of virtualization, standardization and automation, there’s another important element in a private cloud architecture: multitenancy. Cloud-based services must be shared across all of the organization’s departments and business units in order to pay off. “The economies only work when it’s shared,” says Staten. And that means business users must be willing to accept a limited set of standardized service offerings — and fewer customizations.
In some cases, the goal is for not just the virtualized infrastructure but everything up through the application layer to become a set of standardized services shared across the business — something that Fjeldheim says his company is doing now for developers and engineers. Finally, says Staten, a private cloud should include the ability to track service usage and charge users for the consumption of cloud-based services.
Qualcomm isn’t the only company experimenting with cloud. Many IT organizations are starting pilots using one of these options: software-based architectures, such as the open-source OpenStack promoted by Rackspace, NASA and Eucalyptus Systems; or more proprietary, “cloud in a box” setups that tightly couple the software and hardware, such as IBM’s CloudBurst or Hewlett-Packard’s BladeSystem Matrix “converged infrastructure.”
Cloud-in-a-box products typically include a self-service portal, a cost allocation engine and all of the automation requirements for a defined menu of configurations, from the application, underlying databases and other middleware to virtual infrastructure and associated resources that will be assigned. These products enforce a degree of standardization right out of the gate and are a good way to get your feet wet, says Staten.
Rather than choosing one approach, Qualcomm has gone down both roads. “We went with a mixture of OpenStack and cloud-in-a-box solutions,” says Clark. On the open end, Qualcomm uses Eucalyptus as its back end, along with VMware, Computer Associates’ Automation Point tools, and products from rPath. The company is also working under nondisclosure with a vendor that’s developing a cloud-in-a-box system.
Roswell Park Cancer Institute in Buffalo, N.Y., went for a boxed solution, deploying HP’s BladeSystem Matrix in June in an effort to automate provisioning. The 11-blade chassis, which is still being configured, will eventually support 100 existing virtual servers, while another 100 new virtual servers will be built from a standardized set of templates, says Tom Vaughan, director of IT infrastructure.
He sees continued work to conform with ITIL standards for IT service and change management, and the development of a standard service catalog, as key to a successful rollout, since cloud automation must be built upon a set of standardized processes.
Going Public: The Hybrid Cloud
Private clouds aren’t necessarily hosted on-premises. There are hosted private cloud services from third-party providers that manage dedicated cloud infrastructures for their customers at off-site locations.
An advantage of such arrangements is that the service providers can offer “cloud bursting” — the rapid addition of compute capacity from the public, multitenant cloud to handle temporary periods of heavy loads. But IT executives remain cautious about such setups because they have concerns about security and control; they’re also leery of service-level agreements that seem better suited to small businesses.
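The bursting decision described above can be sketched in a few lines — an assumed placement policy for illustration, not any provider’s actual algorithm: fill the dedicated private capacity first, then spill only the overflow to public, multitenant capacity for the duration of the spike.

```python
def place_workload(demand_units: int, private_capacity: int):
    """Split a demand spike between private capacity and a public burst.

    Hypothetical policy: the private cloud absorbs as much of the load
    as it can; anything beyond that 'bursts' to public capacity and is
    released when the spike subsides.
    """
    private_share = min(demand_units, private_capacity)
    public_burst = max(0, demand_units - private_capacity)
    return private_share, public_burst
```

Under this policy, a workload that fits on-premises never touches the public cloud — which is exactly why the approach appeals to executives with the security concerns described below.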
Hot and Cold
Cloud computing was ranked the most overhyped technology, but it was also named the No. 2 technology with the most promise for 2011.
Technology most important to your organization in the next 12 months:
* 1. Virtualization
* 2. Cloud computing
Source: Computerworld’s exclusive Forecast survey of 209 IT professionals, June/July 2010
“Security is a big, big deal for us. And not just security, but the perception of it,” says Fjeldheim.
Wake Technical Community College in Raleigh, N.C., built a different kind of hybrid. The school is part of a consortium that developed a hosted private cloud that serves as a virtual computer lab (VCL) for students. The VMware-based system, built on IBM blade servers by the Microelectronics Centre at North Carolina State University, automatically provisions a virtual machine when a student logs in. Each image lasts for up to eight hours before it’s automatically deprovisioned.
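The VCL’s lease-based lifecycle — provision on login, deprovision after at most eight hours — amounts to a simple TTL check. The helper below is a hypothetical sketch of that idea, not the VCL’s actual code; a real scheduler would run it periodically and tear down whatever it returns.

```python
import time

SESSION_TTL_SECONDS = 8 * 3600  # images last up to eight hours, per the VCL

def expired_sessions(sessions: dict, now: float = None) -> list:
    """Return the VM ids whose lease has passed the eight-hour limit.

    `sessions` maps vm_id -> session start timestamp in seconds.
    This sketch only identifies expired leases; deprovisioning them
    would be the scheduler's next step.
    """
    now = time.time() if now is None else now
    return [vm for vm, start in sessions.items()
            if now - start >= SESSION_TTL_SECONDS]
```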
The VCL system eliminated the need to open a new 50-seat lab, saving $50,000. But CIO Darryl McGraw says the school did run into one issue with a private cloud hosted by a third party. The fall semester starts earlier at Wake Technical College than at North Carolina State, and just as classes started at the former, administrators for the VCL took down the servers for maintenance. “The concept of multiple tenants on the same hardware can be problematic,” McGraw says.
He has also limited the cloud to student labs, as opposed to, say, the ERP system. “The worst that can happen is that someone gets their homework hacked into,” he says. Overall, however, he considers the VCL to be a success, and he plans to push 137 physical computer labs into the VCL cloud over the next five years.
Using commercially hosted private clouds also presents contractual challenges. The CIO at one Fortune 500 company, who is currently negotiating with a major cloud service provider, says it’s not always clear who’s responsible when things don’t work — or how the problem will be resolved.
Providers expect to be able to change terms and conditions in contracts at a moment’s notice, he complains. And he says that the service-level agreements he’s seen are totally inadequate. “They want to limit their liability so greatly that — why bother?” he asks, adding, “If I offered those types of SLAs to the business, I’d be out of a job.”
The CIO, who asked not to be named because he’s still in discussions with the vendor, says that in many cases, such contracts simply aren’t designed for enterprise-class, mission-critical applications. The negotiations with vendors so far, he says, have been “burdensome.”
But the need for rigid standardization of processes, services and infrastructure that underlie the cloud may be the biggest nut to crack in the year ahead. Problems come when IT — or its customers — begin customizing the standardized catalog of services because they don’t like the predefined choices.
“For a private cloud to work, the IT shop has to deliver highly repeatable, consistent services with minimal or no customization,” says Nick Van der Zweep, director of business strategy for industry standard servers and software at HP. For example, HP offers just three configuration templates for hosting an Oracle RAC database in the cloud: small, midsize and large. To host Exchange in the cloud, the administrator can choose only the number of users.
Straying from the path can lead down a slippery slope, since without a finite, standardized services catalog, the benefits of the system are greatly reduced. “All these things I change turn off the ‘cloudiness’ of it,” Staten says.
The only way to succeed, he argues, is to make it more costly for users to go outside of the standard offerings. Users who follow the rules get a system in 15 minutes and a neatly packaged services-based chargeback model. Those who need a custom configuration must wait longer — and pay more.
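Staten’s pricing argument reduces to a small quoting function: standard offerings are cheap and fast, off-catalog requests cost more and take longer. The surcharge and lead times below are assumptions for illustration — the article gives only the 15-minute figure.

```python
# Illustrative numbers only; the surcharge and custom lead time are
# assumptions, not figures from the article.
STANDARD_RATE = 1.00      # relative cost unit per standard server
CUSTOM_SURCHARGE = 0.50   # assumed 50% premium for off-catalog requests

def quote(is_standard: bool, servers: int):
    """Return (relative cost, provisioning time) for a request."""
    if is_standard:
        # Catalog request: automated provisioning, base rate.
        return servers * STANDARD_RATE, "15 minutes"
    # Custom request: manual work, so a longer wait and a premium.
    return servers * STANDARD_RATE * (1 + CUSTOM_SURCHARGE), "days to weeks"
```

The design choice is deliberate: the price and wait-time gap is what steers users back into the standardized catalog that makes the cloud’s economics work.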
But he doesn’t think the chargeback model will fly in many organizations, and he contends that this is what will hold up broader adoption of internal clouds. “We’re confident that 95% of IT shops aren’t ready for that,” says Staten.
Fjeldheim sees IT process standardization as a key to success. “We have a very standardized way of doing things, good automation and process control,” he says, adding that the company recently earned the ISO 20000 IT service management and ISO 27000 IT security management systems certifications. But that doesn’t necessarily mean you have to create different processes or tool sets for managing cloud and managing traditional infrastructures.
To Daniels at integrator ManTech, the key is to identify services that can be reused across the organization — and stick with them. The problem, he says, is that each user organization wants things done slightly differently.
“You get into this ugly business-process re-engineering that has to go on. You say you have to do it this way, and then you have a rift with the user community,” he explains. Those who have gone down this route with service-oriented architectures will have a leg up with internal clouds, he contends. Others, he says, may not appreciate the investment they’ll have to make upfront to take full advantage of a private cloud architecture. “You’ve got to do your homework on the application side.”
John Fiore, CIO at Bank of New York Mellon, worries about vendor lock-in with cloud-in-a-box offerings as he ponders choices for a pilot project. “There really aren’t any standards at this stage of the game. If you go down a certain path, that’s yet another new architecture you’re accepting — and what might be the life span of it?” He says he likes the idea of cloud-in-a-box offerings that bundle all of the necessary hardware and software into one neat, tidy package but adds, “We’d prefer not to have a tight coupling between the two. Until they evolve to a point where a loose coupling is achievable, there are definitely some business risks here.”
As with early virtualization deployments, most private cloud pilots focus on relatively safe, internal IT functions, such as testing and development. Bank of New York Mellon, for example, plans to focus early cloud-automation efforts in the development and quality assurance areas. Wake Technical focused on student labs, where student homework is the only thing at risk.
At Roswell, Vaughan says his first test will be to allow the IT staff to automatically provision virtual machines for testing purposes, before moving on to the server team and application developers. “Instead of going through the paperwork of requesting that a server be built, they will be able to do it themselves,” he says. And as the centre rolls out virtual desktop technology to its Citrix clients next year, Vaughan says the back-end servers will move into the cloud. “I’d like to push out as many services to people as we can,” he adds.
That’s the plan at Bank of New York Mellon as well. Fiore says the bank plans to aggressively explore hosting an internal cloud but will start in the development and testing area, where “a lot of firms are cutting their teeth.” Like Fjeldheim, he says the goal is faster deployments. “Time to market is the thing our customers are demanding. Things like [the Troubled Asset Relief Program] require us to act quickly and with great agility,” says Fiore.
Broader cloud deployments beyond testing and development are still at least three years off, Staten says. But a private cloud ultimately has the potential to take over large tracts of the data centre, especially for Linux- and Windows-based applications with variable workload demands, such as high-performance computing and collaborative or Internet-based applications. Less suitable are workloads that are very stable, where the equipment is fully depreciated and where there are no requirements to upgrade in the near future.
Qualcomm is unlikely to migrate its ERP system to the cloud, because its resource needs are highly predictable. The benefit of the cloud lies in its ability to scale in an automated fashion, Clark says, “but with ERP, we know when we have to scale and we can scale it.”
Nonetheless, more and more workloads are evolving in that direction. “Our portfolio of applications is gradually moving toward the commodity [servers]. As that happens, the appropriateness of cloud rises,” says Pete Johnson, chief technology officer at Bank of New York Mellon. McGraw says the initial costs of setting up a private cloud at the college were high, but in the long term, “it’s worth it.” Vaughan is also bullish on the long-term role of a private cloud architecture. Some functions may never fit into a cloud architecture, but eventually, he says, “I think it will pretty much take over the data centre.”