Today, many companies are so afraid of losing a perceived competitive advantage that they take on additional IT risk by implementing leading-edge technologies they don’t necessarily need.
As a result, they appear to be running fast, but they don’t increase productivity, reduce backlogs or provide cost-effective solutions.
Two opinion pieces I read recently reminded me that it is easy to fall for the glamour of new technologies:
“Mainframes are Dead, Long Live Mainframes” (IEEE Computer, August 1999, p. 104). Ted Lewis shows that mainframe manufacturers continue to ship more MIPS each year. In addition to traditional transaction processing, many organizations now use mainframes as large Web servers. Lewis argues that using a couple of mainframes for a Web site is more cost effective than using a server farm with thousands of quad-Pentiums.
In addition, based on data from Gartner, mainframes are much more reliable. For example, an IBM S/390 has a reliability of 99.998 per cent over a 365-day period, versus 97.44 per cent for PCs running Windows NT.
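The gap between those two figures is easier to appreciate in absolute terms. Interpreting them as uptime percentages (an assumption; the column does not define how Gartner measured reliability), a short sketch converts each into minutes of downtime per year:

```python
# Rough downtime arithmetic behind the reliability figures above.
# Assumption: the 99.998% and 97.44% figures are uptime percentages
# over a 365-day year.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def annual_downtime_minutes(uptime_percent: float) -> float:
    """Minutes of downtime per 365-day year for a given uptime percentage."""
    return MINUTES_PER_YEAR * (1 - uptime_percent / 100)

if __name__ == "__main__":
    for label, pct in [("IBM S/390", 99.998), ("Windows NT PC", 97.44)]:
        print(f"{label}: {annual_downtime_minutes(pct):,.1f} minutes/year")
```

On that reading, 99.998 per cent works out to roughly 10.5 minutes of downtime a year, while 97.44 per cent works out to more than 13,000 minutes, or over nine days.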
And in “Cobol: A Historic Past, A Vital Future?” (IEEE Software, July/August 1999, p. 120) Robert L. Glass states that Cobol continues to be the most-used programming language for systems development. Overall, use of Cobol continues to grow. Cobol 2000 includes features such as support for object-oriented development, distributed objects, component-based development and Web programming. Glass argues that Cobol is still the best language for business development (e.g., decimal and currency related math), and has a higher machine-independence than Java, with compilers available across all types of hardware. Cobol can also be used for Web site development. Glass referred to a story from the Army and Air Force Exchange Service where a team of Cobol programmers built a Web site for an on-line catalogue using Micro Focus’ NetExpress. One person on the team said: “Our team averaged about 15 years of Cobol mainframe experience, but no one on the team had Web GUI design experience.”
During the ’80s there was a great push, led largely by the vendors of mini-computers, to move away from centralized computing to a distributed two-tier client-server architecture. Looking back, the promises have not been met. In fact, for most applications, the total life-cycle cost of buying distributed servers, providing remote maintenance and support, and distributing software outweighs telecommunication and centralized server costs. With the advent of the Web we have seen technology go almost full circle and become essentially mainframe-centric again.
We were also promised improved productivity from CASE tools, 4GLs, and object-oriented languages, but none of them have fulfilled their promises. Many programming languages and tools have appeared and almost as quickly disappeared. Even established languages like Ada and C++ have not gained the prominence expected of them. C++ use appears to be declining as developers replace it with Java. Costs have also escalated through salary premiums for developers with the latest technologies, and the requirement to re-tool development environments and provide re-training programs.
Rather than chasing every new technology “advance” we need to pause and ask a few simple questions:
Will we really be less competitive if we don’t adopt the latest development tool or architecture?
Will the new technology help us focus on automating the right business functions, correctly?
Will it provide real value that we cannot obtain with our current development tools and languages?
Will that value outweigh the re-training and re-tooling costs?
Are we being held captive to the latest “cool” technology fad?
How will this new technology help us improve our development discipline and productivity over the next few years?
James Hughes is a vice-president of software development for EDS Systemhouse in Toronto. He is currently project director for Ontario’s Integrated Justice project. He can be reached at firstname.lastname@example.org.