Exercising application portfolio management disciplines

IT World Canada is pleased to offer research from Meta Group Inc. This content will be featured on ITWorldCanada.com for three months from the article’s publication date.

IT organizations (ITOs) are exercising application portfolio management disciplines and getting organized for the inevitable: the sun is setting on outdated, marginal, or cost-prohibitive applications. Continued cost pressures and changing and expanding requirements, coupled with legacy integration challenges, are driving more ITOs to full-scale application portfolio analysis. Although picking retirement candidates might prove easy, the actual “retirement,” which includes complicated extrication and often long-term data requirements, demands strong project plans, tools, and technical skills.

META Trend: Enterprises will adopt an enterprise portfolio management approach to strategically and tactically deliver business value, optimize all enterprise investments, and lay the groundwork for a technologically sophisticated business strategy. Integrated enterprise-level strategy/planning, architecture, and program management will become key competencies supporting enterprise portfolio management in 50% of the Global 2000 by 2006.

Whether it is the financial, telecommunications, or retail sector, all ITOs are being asked to do more with less. Doing more with less often leads to a disciplined and systematic application portfolio assessment, with the outcome of enhancement for some applications and the distinct possibility of retirement for others. Essentially, ITOs are learning to work their portfolios for better business/IT alignment, which often means fewer resources needed to run and maintain them. Not surprisingly, expensive, complex, and increasingly marginalized legacy-centric systems are often in the bull’s-eye. Application portfolio management is not only for uncovering which applications are prime candidates for retirement; it also includes overall delivery streamlining. For many, application portfolio analysis is a step in preparing for a more flexible on-demand environment. Moreover, the same portfolio retirement skills and methods can be very helpful in improving and documenting applications that still have a long operational life. In many cases, users turn to tools to help rate and compare applications, or they seek third-party suppliers for assistance.

Although a portfolio assessment generally provides a thorough application inventory, that inventory is often devoid of the connections and interapplication “hooks” typical of long-lived, undocumented or poorly documented applications. This is usually where the need for additional tools becomes apparent: tools that can help uncover not just the “whats” of the application inventory but also the “hows.” Before the actual decommissioning and sunsetting, these all-too-often ad hoc connections and hooks have to be uncovered, and new connection solutions must be put in place.

The second portfolio sunsetting challenge is the aged application’s data and the potential future use of this data, from both legal and ongoing operational perspectives. Simply put, while an application might be destined for the junk pile, its data is often reused and repurposed and could well have a window of another five to 10 years or longer. Part of application decommissioning is therefore a plan to keep the data fresh and flexible, which points to mature archiving alternatives.

Discovering Before Decommissioning

Years of ever-changing requirements, multiple development approaches, and neglected quality assurance have left many legacy applications in a state of confusion. Although the overall functioning may be correct, a great deal of integration and connection technology is often unnecessary, with feeds no longer providing value. The dilemma is how to uncover this type of information in a systematic, automated manner. Obviously, programming staffs and operations groups can be queried, and as a last resort, code can be reviewed. However, this uncovers only 30 per cent to 50 per cent of needed (and unneeded or unused) application relationships. Users are increasingly turning to tools such as IBM’s WebSphere Studio Asset Analyzer (WSAA), which META Group classifies as technology relationship mapping (TRM) tools, to do the job. For example, code from partitioned data sets; source code from source control library managers like IBM’s SCLM and Serena’s Changeman; JCL, process, and control cards; and CICS and IMS online regions are input for WSAA. WSAA also views applications, data sets, and batch jobs to piece together a resource chart that can be used to determine an application’s connection map. (WSAA’s input is stored in a DB2 database.) A review of WSAA’s legacy information can expedite an application’s correct sunsetting activities, because all connection points (those actually needed, and those that are included but not needed) can be ascertained.
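To make the idea of relationship mapping concrete, the following is a minimal sketch of the kind of job-to-program-to-data-set discovery a TRM tool automates, assuming JCL members have already been exported to plain-text files in a directory. The directory name, file extension, and patterns are illustrative only; this is not WSAA’s actual scanner or data model.

```python
# Sketch of TRM-style discovery over exported JCL members (hypothetical layout).
import re
from collections import defaultdict
from pathlib import Path

EXEC_PGM = re.compile(r"EXEC\s+PGM=(\w+)")      # programs a job step executes
DD_DSN = re.compile(r"DSN=([A-Z0-9.]+)")        # data sets a job references

def scan_jcl(jcl_dir: str) -> dict[str, dict[str, set[str]]]:
    """Build a per-job map of executed programs and referenced data sets."""
    connections: dict[str, dict[str, set[str]]] = defaultdict(
        lambda: {"programs": set(), "datasets": set()}
    )
    for member in Path(jcl_dir).glob("*.jcl"):
        text = member.read_text(errors="ignore")
        job = member.stem
        connections[job]["programs"].update(EXEC_PGM.findall(text))
        connections[job]["datasets"].update(DD_DSN.findall(text))
    return connections

def shared_datasets(connections) -> dict[str, set[str]]:
    """Invert the map: which jobs touch each data set (the interapplication hooks)."""
    touched: dict[str, set[str]] = defaultdict(set)
    for job, refs in connections.items():
        for dsn in refs["datasets"]:
            touched[dsn].add(job)
    return {dsn: jobs for dsn, jobs in touched.items() if len(jobs) > 1}

if __name__ == "__main__":
    conn_map = scan_jcl("exported_jcl")   # hypothetical export directory
    for dsn, jobs in shared_datasets(conn_map).items():
        print(f"{dsn} is shared by: {', '.join(sorted(jobs))}")
```

Inverting the map to show which jobs share a data set surfaces exactly the kind of hook that must be disengaged or rerouted before a candidate application can be safely retired.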

For a small percentage of legacy applications, some modernization and integration have already been accomplished using solutions such as WebSphere, WebLogic, or other object and object-like paradigms. This has resulted in more connection points that need to be explored, and non-MVS resources have to be ascertained and entered into the discovery process. Users will have to provide WAR and EAR files, Java source and byte code, XML, HTML, and more. Although WSAA has scanners for this, the market is awash in maturing TRM alternatives that focus on Unix, Intel, and other non-MVS platforms and applications (see SMS Delta 1146). Although most non-MVS scanner tools are currently standalone products, given the increased use of provisioning and virtualization tools, many will become part of such maturing provisioning and operational enablers during the next three to five years.
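As an illustration of what scanning such artifacts entails, the following is a minimal sketch that inspects a WAR archive for obvious connection candidates (deployment descriptors and declared resource references such as JDBC data sources). The archive name is hypothetical, and a real TRM scanner parses byte code and descriptors far more deeply than this.

```python
# Sketch of scanning a WAR archive for connection points, assuming standard
# servlet packaging (WEB-INF/web.xml). Only surfaces the obvious candidates.
import zipfile
import xml.etree.ElementTree as ET

def scan_war(war_path: str) -> dict[str, list[str]]:
    findings: dict[str, list[str]] = {"descriptors": [], "resource_refs": []}
    with zipfile.ZipFile(war_path) as war:
        names = war.namelist()
        # Configuration files are the first place connection points hide.
        findings["descriptors"] = [n for n in names
                                   if n.endswith((".xml", ".properties"))]
        if "WEB-INF/web.xml" in names:
            root = ET.fromstring(war.read("WEB-INF/web.xml"))
            # Collect declared resource references (e.g., JDBC data sources).
            for elem in root.iter():
                if elem.tag.endswith("res-ref-name") and elem.text:
                    findings["resource_refs"].append(elem.text.strip())
    return findings

if __name__ == "__main__":
    report = scan_war("legacy-app.war")   # hypothetical archive name
    print("Descriptors:", *report["descriptors"], sep="\n  ")
    print("Resource refs:", *report["resource_refs"], sep="\n  ")
```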

What About Data?

Although many users can see the end and decommissioning of legacy code, the question of how to maintain and recover data demands a more complex and often long-term planning and solution approach. For many legacy applications, the data remains somewhat active for another five to 10 years (archived more than 90 per cent of the time, active less than 10 per cent), and in five per cent to 10 per cent of cases it has a 20-year life (a challenge in itself), during which the data will have to be used with expanding experiential data stores. The simple fact is that the data (or the mechanisms supporting the data) will have to remain intact, and in the case of database systems, “smart” for extended periods. Although some (less than 10 per cent) VSAM data and lesser amounts of IMS data have been converted to Unix or Intel file systems, the vast majority of retired data has been put out to archived pasture. Therefore, users will have to retain procedures, tools, and access methods for periodic data resuscitations using traditional VSAM and IMS data-mover-like tools from BMC, Princeton Softech, and CA. It should be mentioned that just keeping such tools “around” (especially if sunsetting enables a mainframe system to be removed or reduced in capacity) can cause asset management challenges, because software vendors will still want maintenance fees and potential upgrade costs if legacy loads increase.
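What makes a periodic resuscitation possible years later is stewardship metadata kept alongside the archived data. The following is a minimal sketch of that idea, assuming records have already been unloaded to a CSV file; the file names, application name, and manifest fields are illustrative, and vendor archiving tools carry far richer catalogs than this.

```python
# Sketch of recording archive stewardship metadata beside an extracted unload file.
import csv
import json
import hashlib
from datetime import date
from pathlib import Path

def archive_extract(csv_path: str, source_app: str, retain_until: str,
                    schema_version: str, out_dir: str = "archive") -> Path:
    """Copy the unload file into the archive and write a manifest beside it."""
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    data = Path(csv_path).read_bytes()
    target = out / Path(csv_path).name
    target.write_bytes(data)
    manifest = {
        "source_application": source_app,
        "schema_version": schema_version,
        "archived_on": date.today().isoformat(),
        "retain_until": retain_until,
        "sha256": hashlib.sha256(data).hexdigest(),  # detect silent corruption
        "row_count": sum(1 for _ in csv.reader(data.decode().splitlines())) - 1,
    }
    manifest_path = out / (Path(csv_path).stem + ".manifest.json")
    manifest_path.write_text(json.dumps(manifest, indent=2))
    return manifest_path

if __name__ == "__main__":
    # Hypothetical unload from a retired order-entry application.
    print(archive_extract("orders_unload.csv", "ORDERENTRY", "2015-12-31", "v3"))
```

The manifest records the schema version and retention horizon so that whoever restores the data a decade later knows what shape it was in when it was archived.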

Although sunset VSAM and IMS legacy systems will generally exploit established onboard tools, database systems and structures destined for sunsetting will force the acquisition of new tools. Because most DB2 applications are fairly new in comparison to many legacy systems (20 to 30+ years old) and are growing at 40 per cent+ annually, and because other new applications may use DB2 as the data store, portfolio and sunsetting planners will have to incorporate methods and tools that enable the older data to be used with or compared to new information. This points to tools that can initially archive and sunset data. Equally important, they must be able to restore older data and often massage it to fit newer schemas, which is no small task. Two legacy archive and recovery vendors are IBM, with its maturing product, and Princeton Softech, which to date has been setting the standard for flexible database archiving. Although most users are exploiting such tools for “archival,” the restore side will grow in importance in direct proportion to data volume.
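To show why restoring into a newer schema is non-trivial even in the simplest case, here is a minimal sketch that reshapes archived rows via a column-rename map and default values for columns that did not exist at archive time. The column names are hypothetical; real restores against evolving DB2 schemas typically also need type conversions and referential checks that vendor tools handle.

```python
# Sketch of restoring archived rows into a newer schema via a simple column map.
import csv
from pathlib import Path

# Mapping from archived (old) column names to the current schema, plus
# defaults for columns that did not exist when the data was archived.
COLUMN_MAP = {"CUST_NO": "customer_id", "ORD_DT": "order_date", "AMT": "amount"}
NEW_COLUMN_DEFAULTS = {"currency": "CAD", "channel": "legacy"}

def restore_rows(archived_csv: str) -> list[dict[str, str]]:
    """Read archived rows and reshape them to the current schema."""
    rows = []
    with Path(archived_csv).open(newline="") as f:
        for old_row in csv.DictReader(f):
            new_row = {COLUMN_MAP[k]: v for k, v in old_row.items() if k in COLUMN_MAP}
            new_row.update(NEW_COLUMN_DEFAULTS)
            rows.append(new_row)
    return rows

if __name__ == "__main__":
    for row in restore_rows("orders_unload.csv")[:5]:
        print(row)   # rows now shaped for loading into the current table
```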

Business Impact: Projected savings from the decommissioning and sunsetting of legacy systems must account for one-time costs of interapplication discovery and disengagement as well as the ongoing costs of sunset data stewardship.

Bottom Line: Effective decommissioning and legacy application sunsetting will require extensive use of discovery tools, data and database archiving, recovery tools, and strong data skills for 10+ years.
