As the year 2000 spending freeze comes to an end and CIOs help draw up budgets for long-delayed projects, there’s one word that both business managers and IS should keep in mind, according to one industry analyst: e-legacy.
Carl Greiner, vice-president and director of enterprise data center strategies with IT research firm Meta Group Inc. in Stamford, Conn., said there’s a “huge backlog” of IT projects ready to roll once the work associated with fixing Y2K subsides.
Business managers and IT both want these projects, typically concerning e-commerce or ERP-type initiatives, to begin as soon as possible, so they can remain competitive. But neither camp should automatically assume that they’re facing radical changes to their current architecture.
“Understand the weaknesses and the strengths of the infrastructure that you have,” Greiner said. “Let’s look at some of the legacy code that supports this from an IT perspective, and look at the IT infrastructure from an overall operational reliability [point-of-view].”
Enter e-legacy. Though few dispute the still-dominant role of the mainframe in today’s IT environments (mainframes still house approximately 70 per cent of all corporate data), there is some confusion over what role big iron will play in an increasingly Web-enabled world.
The debate over whether to consolidate is hardly new for IT. But, according to Greiner, there is a renewed trend toward server consolidation. And organizations that depend heavily on their legacy systems may not require IS to make the kind of adjustments some think are necessary.
“We’re saying there’s a lot of legacy code out there, and you’re going to have to use it if in fact you want to be truly effective in some of your e-commerce type efforts. As a result, we coined the term ‘e-legacy’ because I don’t know anyone who can truly deploy an operational e-commerce system without using their legacy systems, of some type, at some level.”
For instance, many aspects of e-commerce require front-end access to back-end legacy applications, including data warehouses, Greiner said.
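That kind of front-end-to-back-end bridging often comes down to marshalling web requests into the rigid record formats legacy applications expect. A minimal sketch in C, assuming a purely hypothetical 80-byte record layout (the field names and widths here are illustrative, not drawn from any real system):

```c
/* Hypothetical sketch: marshalling a web order into the fixed-width
   record layout a legacy back-end application might expect.
   The 80-byte layout and field widths are illustrative assumptions. */
#include <stdio.h>
#include <string.h>

/* Build an 80-byte record: 10-char customer ID, 8-char item code,
   6-digit zero-padded quantity, remainder space-padded. */
int build_legacy_record(char *out, size_t outlen,
                        const char *cust_id, const char *item, int qty)
{
    char buf[81];

    if (outlen < 81)
        return -1;

    /* Legacy records are typically space-padded to a fixed length. */
    memset(out, ' ', 80);
    out[80] = '\0';

    /* %-10.10s left-justifies and truncates each field to its width. */
    snprintf(buf, sizeof buf, "%-10.10s%-8.8s%06d", cust_id, item, qty);
    memcpy(out, buf, strlen(buf));
    return 0;
}
```

A web front end would build such a record from form input and hand it to whatever transport reaches the back end, rather than asking the legacy application to change its interface.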
Ideally, IT and business can draw up an architecture plan that lets the organization leverage all the mainframe functionality it can, keep any Unix or Windows NT-based applications to a minimum, and then integrate the two.
Greiner admits that the mainframe is not without its shortcomings. Costs associated with them are higher than with open platforms. As well, few vendors are producing software specifically for the mainframe. And mainframes can’t run popular Unix databases.
“(But) that’s going to dictate where I deploy some of these systems. It’s not an ‘or’ — it’s an ‘and.’ I’m going to have the mainframe. I’m going to exploit that as far as I possibly can, but I’m also going to have Unix and NT in my shop,” Greiner said.
And of all the mainframe-type platforms currently available, IBM Corp.’s S/390 is best equipped to help integrate the back end while leveraging e-commerce potential, Greiner said. “The S/390 is probably the premier platform right now from the standpoint of reliability and availability…there’s nothing that scales like it, and there’s nothing that has the reliability of it, no matter what you read.”
Greiner said the latest versions of OS/390 come with “world class” TCP/IP capability and excellent workload managers, and feature IBM’s Parallel Sysplex clustering software. And IBM recently built Unix System Services into the platform to help IS shops port Unix applications to OS/390.
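Porting Unix C code to OS/390 in practice means sticking to POSIX interfaces and avoiding hidden ASCII assumptions, since the platform’s native character set is EBCDIC. A small illustrative sketch (not from the article) of the difference:

```c
/* Sketch of a codepage-portable check, the kind of detail that matters
   when moving Unix C code to OS/390's Unix System Services, where the
   native character set is EBCDIC rather than ASCII. */
#include <ctype.h>

/* Portable: uses ctype classification instead of assuming ASCII codes. */
int is_ident_char(int c)
{
    return isalnum((unsigned char)c) || c == '_';
}

/* Non-portable: hard-coded ranges assume letters are contiguous, which
   is true in ASCII but not in EBCDIC ('i' is 0x89, 'j' is 0x91, with
   non-letters in between), so this version can misclassify on OS/390. */
int is_ident_char_ascii_only(int c)
{
    return (c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z') ||
           (c >= '0' && c <= '9') || c == '_';
}
```

Code written against the first style recompiles cleanly under Unix System Services; code written against the second often needs a porting pass.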
Unix- and NT-specific servers, meanwhile, are impossible to scale effectively, Greiner said. Nor do Unix and NT have the uptime track records to match the S/390.
Greiner is also warning IS not to be bullied into unrealistic service-level agreements. Instead, he recommends IS take a careful look at what users need, as opposed to what they say they need.
“Not everyone has to run the damn systems 24 by 365, and they don’t all have to be backed up with one-hour recovery time. Some of these systems just aren’t that critical.”