Diversity isn’t an issue just for human resources departments. These days, it’s a fact of life in most IT departments. Just ask Curt Allen, senior server architect at Central Maine Power Co. in Augusta, Me.
Allen manages a network that includes IBM Corp.’s OS/390 on the mainframe; IBM’s AIX on RS/6000 servers; Windows NT 4 and 2000 for file and print sharing; Research Triangle Park, N.C.-based Red Hat Inc.’s Linux on Web servers; and a few OS/2 servers. Clients run several flavours of Windows, and there are some Macintosh workstations in market research and advertising, not to mention the handheld devices in the field.
“Staffing all the necessary skills in a multi-OS environment is a real challenge,” says Allen. “As a result, I have to wear a number of hats.”
His situation is far from unique. Many enterprises are adopting a best-of-breed approach to server operating systems, in which the companies pick the best environment for a particular situation and thus end up with multiple operating systems to manage. The rise of Linux is helping to drive that trend.
Although managing a mixed server operating system network requires a more diverse set of skills and tools than running a single environment, there are ways to limit the complexity. The growth of accepted standards and new interoperability tools make it possible to take a piece of hardware or software from any vendor and fit it into the network.
Workforce decentralization is playing a definite role in the adoption and tolerance of multiple server operating systems. California’s Department of General Services, for instance, had to rethink its computing infrastructure due to soaring office space costs.
“We’ve been told to supply one-third of our staff with the ability to telecommute within the next one to two years,” says Jamie Mangrum, operations manager for enterprise services in Sacramento.
But the department isn’t set up to enter 6,000 employee homes to configure workstations and deploy software. For an agency already running Microsoft Corp.’s Windows, Novell Inc.’s NetWare and legacy programs, managing an additional batch of operating systems in employee homes requires a simple means of integration.
So the department implemented server-based computing using Windows 2000 Terminal Services (WTS), which lets applications reside on the server so workstations don’t need individual set-up. With WTS, a seat licence must be purchased for each user, but if the client will access the network only through WTS, the price is about half the cost of a full Windows 2000 licence.
WTS is based on the Independent Computing Architecture (ICA) from Citrix Systems Inc. in Fort Lauderdale, Fla., which allows clients running Unix, Windows, Macintosh or Windows CE operating systems to access programs running on Windows or Sun Microsystems Inc.’s Solaris application servers.
In essence, WTS delivers a Windows 2000 desktop to remote computers or handheld devices by carrying keyboard and mouse input from the client and display output from the application server. Because only input events and screen updates cross the wire, applications can run remotely over low-bandwidth connections.
But the Citrix Metaframe/WTS rollout didn’t go so smoothly at the government offices of Sonoma County, Calif. “Citrix has limited printer driver support, and we noticed unexplained slowdowns on Novell print servers with applications like GroupWise,” says John Forberg, Sonoma County’s IT director. To solve that problem, he abandoned network queues and installed a Windows print server on a file server.
Forberg mentions another problem. “Thin-client [computing] means users can’t bring in floppies or CDs,” he says, so it’s not uncommon to experience some user pushback. Nevertheless, he points out that this architecture saves money. “Users might not agree,” he says, “but they don’t pay the bills.”
A thin-client model isn’t the only route to server operating system interoperability. Another popular approach is the use of standard protocols to bridge operating systems and applications.
William Zachmann, a consultant at Meta Group Inc. in Stamford, Conn., says, “XML offers a very broadly interoperable set of standards and protocols that can be used to communicate among online systems.”
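As a minimal illustration of the idea, data written as XML on one operating system can be read unchanged on any other. The sketch below uses Python’s standard-library `xml.etree.ElementTree`; the order record and its fields are invented for the example:

```python
import xml.etree.ElementTree as ET

# One system serializes a record as XML...
order = ET.Element("order", id="1001")
ET.SubElement(order, "customer").text = "Central Maine Power"
ET.SubElement(order, "quantity").text = "12"
payload = ET.tostring(order, encoding="unicode")

# ...and another system, on any platform, parses the same bytes.
parsed = ET.fromstring(payload)
customer = parsed.findtext("customer")
quantity = int(parsed.findtext("quantity"))
```

Because the payload is plain text with a well-defined structure, neither side needs to know what hardware or operating system produced it.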
The use of TCP/IP instead of proprietary architectures has also reduced device dependency (when a particular device, like a Macintosh computer or a Palm Inc. handheld, requires a particular operating system). That’s driving the rise of Web-enabled applications.
“We wanted a thin-client or device-independent architecture to make it easy to access corporate applications over the Internet,” says John Townsend, manager of network operations at DTE Energy Inc. in Detroit. He chose Enterprise 3 software from Santa Cruz, Calif.-based Tarantella Inc. The software, which resides on its own Linux or Unix box, acts as a middleman by taking keyboard and mouse inputs from clients, converting them into the appropriate protocol and sending them to application servers. The Enterprise 3 box then receives data from the application, converts it to a Web document and sends it to the user. All the client requires is a browser.
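The “middleman” pattern Townsend describes can be sketched in miniature: a gateway accepts an ordinary HTTP request from a browser, fetches data from a back-end application server (stubbed out here, since Enterprise 3’s actual protocol conversion is proprietary), and returns the result as a Web document. Everything in this sketch, including the `/payroll` path, is hypothetical:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def fetch_from_app_server(path):
    # Placeholder: a real gateway would speak the application
    # server's native protocol and relay keyboard/mouse input.
    return f"application data for {path}"

def render_as_html(app_output):
    # Convert raw application output into a Web document.
    return f"<html><body><pre>{app_output}</pre></body></html>"

class GatewayHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = render_as_html(fetch_from_app_server(self.path)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Run the gateway on an ephemeral port and hit it with a plain browser-style request.
server = HTTPServer(("127.0.0.1", 0), GatewayHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/payroll") as resp:
    page = resp.read().decode()
server.shutdown()
```

The point of the pattern is that the client needs nothing but a browser; all protocol knowledge lives in the gateway box.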
DTE Energy primarily uses Unix for enterprise applications, but runs Windows NT on department-level application servers. A user goes to an internal Web page that contains a menu of applications, both Unix- and Windows NT-based. After the user clicks on an application and goes through normal authentication procedures, the Tarantella software launches it.
Enterprise 3 also came in handy when DTE acquired a gas company in a different city. Townsend explains, “As we migrate them to the DTE site, they use Tarantella to access their corporate applications so we don’t have to move the applications over.”
Although the integration of diverse server operating system elements can now be accomplished with relative ease, administration is another matter. If management of the network places a burden on IT, it can defeat the purpose of a thin-client or standards-based approach to running multiple systems.
Central Maine Power, for example, wanted to avoid the lengthy installation times and high cost often associated with enterprise management frameworks. “We ended up spending too much time managing and maintaining our previous high-end management software,” says Philip Mourneault, the utility’s networking specialist.
So Central Maine Power opted for WebNM, a low-cost (and partially open-source) network management system from Somix Technologies Inc. in Sanford, Maine. WebNM works in tandem with WhatsUp Gold network monitoring software from Ipswitch Inc. in Lexington, Mass.
WebNM monitors and controls SNMP-enabled devices through a browser interface. At Central Maine Power, that includes all the servers, routers, switches, workstations, printers and SNMP power strips, even though they may be operating on any of a dozen server operating systems.
“WebNM doesn’t give us the depth of information more expensive systems might provide, but it is more than adequate for our needs,” says Mourneault.
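WebNM itself polls devices over SNMP; the core up/down idea can be roughed out with a plain TCP reachability check instead. The sketch below is an assumption-laden stand-in, not WebNM’s method: the local listener merely plays the role of a monitored server.

```python
import socket

def is_reachable(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Demo: open a local listening socket so the check has something to probe.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
demo_port = listener.getsockname()[1]

up = is_reachable("127.0.0.1", demo_port)    # listener is accepting connections
listener.close()
down = is_reachable("127.0.0.1", demo_port)  # nothing listening any more
```

A monitoring loop would run such checks (or real SNMP GET requests) on a schedule for every device and surface the results in one place, which is what makes a single browser view of a dozen operating systems possible.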
Despite all the work it takes to integrate server operating systems, it’s easier than before. “It used to be that each vendor had one or more proprietary architectures,” says Meta Group’s Zachmann. “Now it is basically down to IBM, Windows and Unix/Linux.” He predicts that the market will continue to consolidate along the Windows and Linux platforms.
Framingham, Mass.-based International Data Corp. agrees. IDC figures show that 41 per cent of server licences issued last year were for Windows, and 27 per cent were for Linux. Novell NetWare and Unix accounted for 14 per cent each. IDC says that five years from now, multiple server environments will still be very much in evidence.
All signs point to an acceleration of the move to server-based, thin-client infrastructures and standards-based applications. California’s Department of General Services, for example, began with one server, expanded to 14 servers and 350 users, and more are on the way. “I’ve been asked not to show WTS to anyone else,” Mangrum says, “because people want it right away.”
Drew Robb is a freelance writer in Tujunga, Calif. Contact him at [email protected].