Parsons Corp., a US$3 billion construction and engineering company based in Pasadena, Calif., once had hundreds of fat clients on the desktops of its engineers. That spelled nothing but trouble for the IT staff. “We had cadres of IT folks who would go around with CDs, and they’d push the user aside and say, ‘Hey, go have a smoke while I download this application,’” says CIO Joe Visconti.
That was only the beginning. “If it was something like AutoCAD, it could take an hour to load, then the IT guy would have to configure it,” he recalls. “Then he’d get a call a few minutes later saying, ‘Hey, this is not running. Help me.’ Then, as soon as there was a patch or new release, someone would go through all the desktops again.”
Keeping track of which users had which versions of an application, which patches they had installed and so on was a nightmare, Visconti says. And if a user needed multiple versions of software for different engineering projects, the versions had to be installed and uninstalled as his needs changed.
That was the lay of the land in most IT shops as the century turned, and it’s the way things still are today at many companies. But new models of computing are taking hold as IT looks to reduce the cost and complexity of managing PCs. Among these are the virtualization and streaming of desktop applications, with the goal of moving the management of desktops to the data center, where it can be done more easily, more securely and often more cheaply.
The virtualization and streaming of applications evolved from a long heritage. In the 1970s, dumb terminals connected to mainframes. The big desktop boxes were aptly named; all they did was collect keystrokes and deliver boring green text. Then in the 1980s came minicomputers and PCs, connected in a paradigm-busting arrangement called client/server computing. These desktop machines were far from dumb; they were called “fat clients” because they were fully loaded with processors, memory, disk drives, I/O devices, operating systems and application software.
In the 1990s, things got a bit more complicated. IT managers discovered that still more tiers could bring even better performance, flexibility and scalability. Applications could be broken into presentation, business logic, data access and data storage layers, each residing where it worked best.
At the same time, there was a backlash against the cost and complexity of fat clients, and some IT managers turned to “thin clients” and “network computers,” basically dumb terminals with a grade-school education.
But these days, operating system upgrades, new applications, bug fixes and security patches have escalated in frequency. Users are more likely to install their own applications and even demand that IT install special software for them. Substantial portions of IT staffs travel from desktop to desktop, keeping PCs running properly.
Enter virtualization — which isolates the application from the operating system and other applications — and streaming, which delivers the application to the user.
By moving the management of desktops to the data center, this combination can reduce hundreds of desktop environments to one that’s under lock and key, while giving the user the illusion that he still has a fat client. Or a server can hold multiple desktop images, each tailored to a specific user’s work based on profiles stored in a directory.
Then, when the user needs them, those applications — and sometimes complete operating environments — can be “streamed” over the network to the desktop, where they execute locally, without the server and communications overhead that comes from traditional client/server or thin-client computing. Some products allow the streaming of just those pieces of software actually needed for that session — perhaps just 20 percent of an application’s code — minimizing the demand for bandwidth, memory and disk.
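The partial-streaming idea described above can be sketched in a few lines of Python. Everything here (the block-based server image, the size of the “launch set,” the session’s access pattern) is invented for illustration; real products divide and prefetch application images in proprietary ways.

```python
# Hypothetical sketch of on-demand streaming: only the blocks an application
# actually touches are pulled over the network, and execution can begin once
# a small launch set has arrived. All names and numbers are illustrative.

# The server-side image: application code split into fixed-size blocks.
SERVER_IMAGE = {i: f"block-{i}" for i in range(100)}
LAUNCH_SET = range(10)  # roughly 10% of blocks needed just to start

class StreamedApp:
    def __init__(self):
        self.cache = {}          # blocks already on the client
        self.bytes_streamed = 0  # counts network fetches

    def fetch(self, block_id):
        """Pull a block from the server on first use, then serve it locally."""
        if block_id not in self.cache:
            self.cache[block_id] = SERVER_IMAGE[block_id]  # simulated fetch
            self.bytes_streamed += 1
        return self.cache[block_id]

    def launch(self):
        # Prefetch only the launch set, then start running.
        for b in LAUNCH_SET:
            self.fetch(b)

app = StreamedApp()
app.launch()
# During the session the user exercises only a fraction of the features:
for b in [3, 15, 15, 42]:
    app.fetch(b)
print(app.bytes_streamed, "of", len(SERVER_IMAGE), "blocks streamed")
```

Only the blocks touched during the session ever cross the network, which is why vendors can claim that perhaps 20 percent of an application’s code is enough for typical use.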
Virtualization allows the streamed applications to reside in their own self-contained operating environments. They can be encapsulated, with their own dynamic-link libraries (DLLs) and registry settings, so that multiple versions of an application can coexist without conflicting. When something goes wrong — say, a PC gets a virus infection — a new desktop image can be streamed to the user without a visit from IT.
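The isolation trick can be illustrated with a toy copy-on-write layer: each virtualized application sees its own private view of shared settings, so two versions can hold conflicting DLL or registry entries without touching the real system or each other. The class and key names below are invented for this sketch.

```python
# Toy model of application virtualization: writes land in a per-application
# private layer; reads check that layer first and fall through to the shared
# system state. Names are hypothetical.

SYSTEM_REGISTRY = {"common.dll": "v1.0"}

class VirtualEnv:
    def __init__(self, name):
        self.name = name
        self.private = {}  # per-application overrides

    def set(self, key, value):
        self.private[key] = value  # write goes to the private layer only

    def get(self, key):
        # Reads prefer the private layer, then the shared system registry.
        return self.private.get(key, SYSTEM_REGISTRY.get(key))

old = VirtualEnv("CAD release 2004")
new = VirtualEnv("CAD release 2006")
new.set("common.dll", "v2.0")  # would clash if installed natively

print(old.get("common.dll"))           # the old version still sees v1.0
print(new.get("common.dll"))           # the new version sees its own v2.0
print(SYSTEM_REGISTRY["common.dll"])   # the real system is untouched
```

Because nothing is written into the shared state, uninstalling or replacing an application amounts to discarding its private layer.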
But there are some caveats. A robust network is required to avoid delays while streaming occurs — although applications can sometimes be started before they are fully downloaded, and parts or all of commonly used applications may be cached locally. If a connection can’t be maintained, as with a laptop in motion, whatever software will be needed before the connection is restored must already be cached on a local disk. And there are enough differences from traditional computing methods to require some attitude adjustments on the part of both users and IT support staffers.
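The offline caveat comes down to a simple lookup order: check the local cache, then the network, then fail. The sketch below (all names and payloads invented) shows why an application that was never cached simply cannot run once the connection drops.

```python
# Minimal model of the offline-caching caveat. Apps pinned to the local
# cache keep working without a connection; anything else raises an error.

local_cache = {"word_processor": "binary-image"}  # pinned before travel
server = {"word_processor": "binary-image", "cad_suite": "binary-image"}
connected = False  # laptop in motion, no network

def load(app):
    if app in local_cache:
        return local_cache[app]          # runs fine offline
    if connected:
        local_cache[app] = server[app]   # stream it and cache it
        return local_cache[app]
    raise ConnectionError(f"{app} is not cached and there is no connection")

load("word_processor")   # succeeds from the local cache
try:
    load("cad_suite")    # was never cached, so it fails offline
except ConnectionError as e:
    print(e)
```

This is why mobile users must pre-cache, or “check out,” the software they expect to need before disconnecting.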
At Parsons, Visconti installed a streaming tool called AppExpress from Endeavors Technology Inc. in Los Angeles two years ago. Now, he says, “I can get anybody up, anywhere in the world, on any application in five minutes. As soon as I have a patch, I patch it on the server, and in a few minutes, everybody has the new version.”
But AppExpress is not a conventional software-distribution tool. A Parsons engineer indicates via a Web portal which project he is going to work on, and a homegrown configuration management system instructs AppExpress to stream the needed applications to his desktop. It also streams configuration parameters related to printers, plotters and other devices.
“First, the server establishes whether the user has the application on his desktop,” Visconti says. “If not, it streams just enough so it starts to execute.”
While taking inventory of desktop contents, AppExpress can also find and report bootleg software, he says.
Visconti says Parsons may have saved as much as $1 million last year from the streaming technology, which served 600 PCs. “We are cutting the cost of IT support almost to nothing,” he says.
And Visconti says the “on-demand” nature of streaming — the user gets the application only when he needs it and for only as long as he needs it — has important software licensing benefits as well. He is striking enterprise agreements with software vendors that allow payment based on actual usage, which is determined at the end of each quarter. That kind of agreement enables Parsons to install all of a vendor’s products on AppExpress servers while paying for only the ones actually used, he says. And new applications get to users in minutes, not days.
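The licensing arithmetic Visconti describes can be sketched in a few lines: every product sits on the streaming servers, but the quarterly invoice covers only titles that were actually launched. The prices, product names and usage log below are invented for illustration.

```python
# Back-of-the-envelope model of usage-based licensing: bill distinct users
# per product at quarter's end; unused products cost nothing. All figures
# are hypothetical.

PRICE_PER_SEAT = {"cad": 300, "analysis": 500, "modeler": 800}

# The quarter's streaming log: (user, application) pairs.
log = [("alice", "cad"), ("bob", "cad"), ("alice", "analysis")]

# Count distinct users per product; "modeler" was available but never used.
seats = {}
for user, app in log:
    seats.setdefault(app, set()).add(user)

invoice = {app: len(users) * PRICE_PER_SEAT[app] for app, users in seats.items()}
print(invoice)                # "modeler" does not appear on the bill
print(sum(invoice.values()))  # total owed for the quarter
```

Under a flat per-seat license, the unused product would have been paid for anyway; under this model it drops off the invoice entirely.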
The Cleveland Municipal School District takes a slightly different approach, using a pair of complementary products for software streaming. It uses the Software-Streaming Platform from Ardence Inc. (recently bought by Citrix Systems Inc.) to stream a standard “base layer” — the operating system plus the core applications that all users need, such as Microsoft Office and Adobe Acrobat — on each of 15,000 PCs in 104 buildings.
If a local machine becomes infected or corrupted in some way, it is simply rebooted using a new desktop image streamed from the data center, and the user is back up in minutes. “The dream is to get all the desktops identical, then worry about layering applications on top of that,” says school district CIO Thomas Bender.
At the application layer, the school district uses AppStream from App