For the past three years, HotSchedules.com Inc., an Austin-based provider of online labor-scheduling services, has experienced annual revenue growth of over 100 percent — a boom that could have cost it $60,000 in server hardware purchases.
Yet HotSchedules.com has managed to support an ever-expanding network, enhance its in-house server security and deliver maximum uptime to nearly 375,000 users — all while cutting hardware expenditures and stabilizing electricity costs.
The answer was virtualization, an increasingly popular technology as cash-strapped companies aim to cut capital spending on server hardware. Essentially, virtualization is a layer of software that lets companies consolidate several of their in-house servers onto a single piece of hardware. The upshot: the power of dozens of servers for a fraction of the price and space.
But virtualization wasn’t HotSchedules.com’s first stab at boosting its computing power. Back in 1999, it rented server power and storage for a monthly fee from a third-party provider — essentially what marketers today call cloud computing. However, poor customer service, mounting costs and limited capacity prompted the company to finally make the switch to virtualization.
“The vendor provisioned a couple of servers and some memory for us, but that’s all we got,” says Matt Woodings, HotSchedules.com’s chief technology officer. “We received poor customer service, and it just wasn’t the best experience.”
Companies such as HotSchedules.com are fast discovering that there’s no such thing as a one-size-fits-all approach to bolstering your server and storage capabilities. Some argue that virtualization demands high-priced in-house expertise in exchange for greater security. Others applaud cloud computing’s same-day scalability while at the same time questioning its overall reliability.
Even the cost savings loudly touted by well-respected vendors are being hotly debated. A controversial March 2009 study by consulting firm McKinsey & Co. concluded that for large businesses, shifting IT work to the cloud can be more costly (and less reliable) than owning the hardware in-house.
So how can a company decide whether virtualization or cloud computing is a better bet for its needs?
Mark Tonsetic, a program manager at The Corporate Executive Board’s Infrastructure Executive Council in Washington, says the key is to make a selection “on a project-by-project basis, based on the nature of the application or data that’s being supported.” Each project, he adds, should be evaluated using criteria ranging from server workload demands and disaster recovery requirements to security risks and vendor considerations.
In the case of HotSchedules.com, the deciding factor was the need to accommodate growth while keeping expenses down. On the brink of signing a large new client, the company realized that it would have to purchase $60,000 worth of server hardware, doubling the size of its data center, in order to accommodate the staggering computing demands of a sizable restaurant chain.
“That’s a huge expense to outlay right out of the gate,” says Woodings. “So I thought, Well, there’s got to be a better way to do this. I’ve got to get more bang for my buck.”
Since deploying Microsoft Corp.’s Hyper-V virtualization technology last year, HotSchedules.com has consolidated 42 physical servers down to four. By multiplying capacity by 10 times, the company has been “able to take on a considerable amount of clients without seeing any kind of increase in monthly expenses,” says Ray Pawlikowski, HotSchedules.com’s CEO. For example, the company pays an average of $12,000 per month in energy bills — a figure that Pawlikowski estimates would have spiked to $20,000 a month by adding brand-new servers to the data center.
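The arithmetic behind those figures is straightforward. A minimal sketch, using only the numbers cited in the article (the formulas themselves are an assumption, not the company's own model):

```python
# Back-of-the-envelope consolidation math from the figures in the article.
# The server counts and energy bills come from the text; the derived
# ratio and savings calculations are illustrative assumptions.

physical_before = 42           # physical servers before Hyper-V
hosts_after = 4                # virtualization hosts after consolidation
consolidation_ratio = physical_before / hosts_after

energy_actual = 12_000         # average monthly energy bill, dollars
energy_projected = 20_000      # Pawlikowski's estimate with new servers
monthly_savings = energy_projected - energy_actual

print(f"Consolidation ratio: {consolidation_ratio:.1f} to 1")
print(f"Estimated monthly energy savings: ${monthly_savings:,}")
print(f"Estimated annual energy savings: ${monthly_savings * 12:,}")
```

The roughly 10-to-1 consolidation ratio matches Pawlikowski's claim of multiplying capacity by 10 times, and the estimated $8,000 monthly gap in energy costs compounds to nearly $100,000 a year.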
Cloud computing may offer low rates for basic packages, but Woodings warns that “once you start adding the bells and whistles that our clients request, then suddenly that price range becomes prohibitive.” These more sophisticated features include iPhone scheduling apps, automated English-Spanish phone service and instant notification of shift changes.
But external cloud services can be a boon to businesses that don’t want the hassles of keeping up with burgeoning data growth.
That was the case for FreshBooks, an online invoicing and time-tracking unit of 2ndSite Inc. in Toronto. The company could either farm out the storage of photos, design logos and spreadsheets for its nearly 1 million customers or continue to contend with huge “infrastructure management headaches,” says CEO Mike McDerment. He knew that constantly scaling FreshBooks’ servers to meet the fluctuating storage needs of its primarily small-business clientele could only spell trouble. So the 33-person outfit signed on to Cloud Files, a service from Rackspace US Inc.
“While we are a technically sound company, scaling vast amounts of data quickly is not core to our business,” says McDerment. “The storage of these [document] files is painful and costly. By taking that piece away from us, [Rackspace] enables us to focus on writing our applications.”
That wouldn’t have been the case with virtualized servers, according to McDerment, who says maintaining the project in-house would have “left us with the same problem of scaling all kinds of different servers to achieve the needs of our customer base.”
Better yet, by dipping its toes into cloud computing with a low-risk application such as document storage and management, FreshBooks reaps the rewards of cloud computing without losing sleep over the odd outage or technical snafu. After all, much-publicized service outages have plagued cloud computing from the beginning. That’s why McDerment carefully weighed the technology’s benefits against the amount of downtime expected in today’s cloud service-level agreements.
“People need to go in with open eyes and ask, ‘If cloud computing is supporting my company’s core service, can 45 minutes of downtime be acceptable?’ ” says McDerment.
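McDerment's 45-minute question maps directly onto the availability percentages quoted in service-level agreements. A rough sketch of that conversion (the 30-day month is an assumption for illustration, not a figure from the article):

```python
# Convert a tolerable amount of downtime into an SLA-style availability
# percentage. The 45-minute figure is from McDerment's quote; the
# 30-day monthly window is an illustrative assumption.

minutes_per_month = 30 * 24 * 60   # 43,200 minutes in a 30-day month
downtime_minutes = 45              # outage a company might have to absorb

availability = 100 * (1 - downtime_minutes / minutes_per_month)
print(f"{availability:.3f}% monthly availability")
```

In other words, accepting 45 minutes of downtime a month means accepting a service pledged at roughly "three nines" — about 99.9% availability — which is why the fine print of a cloud SLA deserves the scrutiny McDerment describes.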
Tolerating the occasional cloud outage was a luxury Bill Gillis couldn’t afford, however. Gillis is the manager of clinical application services at Beth Israel Deaconess Medical Center. The Boston-based teaching hospital recently deployed technology from virtualization giant VMware Inc. that lets more than 200 private physician practices throughout Massachusetts access electronic health records. Given the highly confidential nature of the medical data, Gillis says the choice was obvious.
“Because it’s patient records, we wanted to own the space where that information resides. Sure, you could have a vendor sign whatever confidentiality agreements you need, but it opens up a risk,” he says. “If a [vendor’s] rogue employee sells all of our AIDS patient data to some medical research company, we’re responsible for that breach of security without having any real recourse. At least with virtualization, it’s our own staff, and we own that data.”
In the end, though, choosing between cloud computing and virtualization is as much about looking to the future as it is about assessing present-day needs. After all, companies need to be able to grow with their computing power and storage capacity for years to come. “In today’s environment, where capital comes at a premium,” Tonsetic says, “organizations have to look very hard at their capacity needs in the next two or three years, and whether the capital investment they make today is the right decision.”