Tooling Up for Data Migration

Even before the arrival of the new millennium, Y2K has already adversely affected many organizations by diverting millions of dollars away from high-priority system projects. Many data migration projects have been pushed to the back burner, and firms have delayed ERP implementations and data warehousing projects that depend on those migration efforts. Compounding the problem is the adoption of a single currency standard by the European Monetary Union (EMU). The price tag for conversion to the euro is expected to range between US$150 billion and US$400 billion for the eleven EMU nations alone.

On the surface, these issues appear to be vastly different. Y2K is all about fixing date fields, while the euro issue revolves around a mishmash of multiple currency conversions. Data migration deals with moving data from older systems and non-relational formats to newer systems capable of leveraging relational databases accessed through Structured Query Language (SQL) with the latest client/server and web-based applications.

However, whether the challenge is Y2K compliance, euro conversion, or migrating huge non-relational data sets into SQL tables, recognizing that each issue is primarily about fixing data, not programs or systems, is a major step toward solving all three.

“Knowing your data” is the key to untangling legacy archives, whether they reside on mainframe, mini, or UNIX systems, or a combination of all three. And automated data management tools offer the last real opportunity to achieve Y2K compliance against a very short deadline, while aiding in ongoing SQL data migration efforts that will also require a massive amount of data analysis and testing prior to conversion.

What some IT people still don’t realize is that the synergy between the different issues means that they can apply the same data management tool, preferably an automated one, to each problem.


The real problem with many Y2K compliance efforts lies not in the ability of programmers to fix old program code, but in the enormous amount of time necessary to manually analyze and remediate decades of glitches scattered across terabytes of data. The countless hours required for analysis, testing, and conversion of every date, date calculation, and embedded business rule ever recorded by your enterprise are the primary reason that many organizations won’t cross the finish line in time.

Creating a euro-compliant system bears a great deal of resemblance to building a Y2K-compliant system. Programs and data have to be analyzed, tested, and converted. Compounding the problem is the reality that a three-year phase-in period will result in significant gaps between accounting systems during the transition. Not only will straight conversions between national currencies and the euro be necessary, but also dual conversions from one national currency to another via the euro, depending on where the system on the other end of the transaction is in its euro-compliance process. Again, the problem centers on finding ways to deal with critical data accurately, efficiently, and expediently.
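The dual-conversion requirement can be sketched in a few lines. The EU’s triangulation rules mandate converting from one national currency to another via the euro, with the intermediate euro figure kept to at least three decimal places; the rates shown are the published fixed euro rates, but treat both the rates and the rounding choices here as an illustrative sketch, not a compliance implementation.

```python
from decimal import Decimal, ROUND_HALF_UP

# Fixed conversion rates: units of national currency per 1 euro
# (illustrative subset of the published rates).
RATES = {
    "DEM": Decimal("1.95583"),  # German mark
    "FRF": Decimal("6.55957"),  # French franc
}

def to_euro(amount, currency):
    """Straight conversion of a national-currency amount to euro, rounded to the cent."""
    euro = Decimal(amount) / RATES[currency]
    return euro.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

def triangulate(amount, from_ccy, to_ccy):
    """Dual conversion between two national currencies via the euro.
    The intermediate euro amount keeps three decimal places, as the
    EU triangulation rules require at minimum."""
    euro = (Decimal(amount) / RATES[from_ccy]).quantize(
        Decimal("0.001"), rounding=ROUND_HALF_UP)
    result = euro * RATES[to_ccy]
    return result.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
```

The point of using decimal arithmetic rather than binary floating point is exactly the data-accuracy concern the conversion effort revolves around: rounding must be reproducible on both ends of the transaction.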

The last issue, data migration, reared its head when the euro was still just a twinkle in a few European leaders’ eyes. At roughly the same time, the IT industry was discovering the value of relational databases for enterprise-level applications, and also coming to the realization that time was running short on Y2K.

Despite the funneling of funding away from migration projects because of Y2K, data migration hasn’t disappeared from IT radar screens or lost any of its fundamental business value, and moving enterprise data into the more efficient and functional SQL format remains a top priority. Once again, evaluation, testing, and remediation of your enterprise archives is necessary, as the data needs to be groomed into a common format before it can be converted into SQL tables.
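The grooming step described above, taking fixed-width flat-file records and loading them into relational tables, looks roughly like the following. The record layout, field names, and table are hypothetical, chosen only to show the shape of the work.

```python
import sqlite3

# Hypothetical fixed-width legacy record layout:
# name (cols 0-19), account number (20-27),
# balance in cents (28-37), two-digit year (38-39).
FIELDS = [("name", 0, 20), ("account", 20, 28),
          ("balance", 28, 38), ("yy", 38, 40)]

def parse_record(line):
    """Slice one fixed-width record into a dict of cleaned fields."""
    rec = {name: line[start:end].strip() for name, start, end in FIELDS}
    rec["balance"] = int(rec["balance"]) / 100.0  # cents to currency units
    return rec

def load(lines, db):
    """Groom flat-file records into a common format and insert them as SQL rows."""
    db.execute("CREATE TABLE IF NOT EXISTS accounts "
               "(name TEXT, account TEXT, balance REAL, yy TEXT)")
    rows = [tuple(parse_record(line)[k]
                  for k in ("name", "account", "balance", "yy"))
            for line in lines]
    db.executemany("INSERT INTO accounts VALUES (?, ?, ?, ?)", rows)
```

Note that the two-digit year column survives the load untouched; remediating it is the Y2K half of the same data problem, which is precisely the synergy the article argues for.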

Obviously, data lives at the center of each of these concurrent issues. The question is, can you empower your IT staff with tools that help them meet their deadlines for handling the daunting task of data management behind each project while operating within the existing IT budget?


The Y2K deadline has left many IT departments short on options; outsourcing your data management to a “data mill” is increasingly expensive, yet few IT shops have enough in-house manpower to manually accomplish the task on their own. In this light, the need for an automated tool that speeds up the data handling aspect of the Y2K conversion process, as well as related euro conversion and SQL migration projects, is clear.

“In the face of the millennium bug, IT shops have done some strange things to their data files,” said Michael Cohn, former head of IBM’s Y2K consulting efforts in the Southeastern U.S. “Windowing and other date manipulations that circumvent the problem are stop-gap measures aimed exclusively at beating the January 1st deadline, but they aren’t the complete answer.”
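The windowing Cohn refers to can be shown in a few lines. A fixed window assigns every two-digit year to one of two centuries around a pivot; the pivot value of 50 below is an illustrative choice, not a standard, and as Cohn notes, the technique sidesteps the problem in the data rather than fixing it.

```python
# A fixed-window sketch: two-digit years at or above the pivot are assumed
# to fall in the 1900s, the rest in the 2000s. Data "remediated" this way
# still carries two-digit years, which is why it is only a stop-gap.
PIVOT = 50

def expand_year(yy):
    """Expand a two-digit year to four digits using a fixed window."""
    return 1900 + yy if yy >= PIVOT else 2000 + yy
```

A record written in 1987 comes out as 1987, while a loan maturing in ’03 comes out as 2003; but any genuine date outside the hundred-year window is silently misfiled, which is the incompleteness Cohn is pointing at.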

Cohn believes that an automated data management tool is the one thing that can help sort the data out as part of a larger remediation process, at the same time that the organization is addressing Y2K and euro compliance. Programmers can quickly and easily restore data integrity without hours of manual processing, and most importantly, the data can also be moved into a common format to ease any potential migration efforts.

“In many ways, you can break the computer industry into three eras,” Cohn continued. “In the seventies, everything was about hardware: what kind of machine do you have? What’s its processing power? By the late eighties, it was about networks and systems working together. In the nineties, we’ve finally come around to realizing everything is about the data that all those systems store, transfer, and process.”


A recent PC Week article (November 9), “Betwixt and Between Y2K and the Euro?”, suggested that current year 2000 project management teams could be left in place to deal with the fundamentally similar euro conversion issue. While this is sage advice, the article missed suggesting the next logical step: that the low-level task of data migration from old flat file formats into relational tables can also be effectively handled by the same team.

Incorporating all three projects can simplify things for programmers and technicians alike. Instead of two or three teams, a single Project Management Office (PMO) can steer a unified data remediation team toward organizational objectives.

The potential for exploiting synergies between similar project functions like system inventory and assessment, configuration management, and testing, serves as a powerful incentive for creating a central PMO. By keeping IT staff already familiar with the management organization and its plans in contact with the other projects, retraining and project ramp-up delays can be virtually eliminated.

At this late stage in the Y2K game, the role of an automated data management tool is explicit. Far more cost-effective than outsourcing, it can be an invaluable addition to the testing arsenal of an IT shop that has already converted its executable code but is falling behind because of the sheer scale of its data conversion problem.

It’s also clear that companies have no choice but to address Y2K and euro-compliance issues as quickly as possible, while data migration patiently waits for its inevitable return to the top of the priority list. Myriad other data-centric issues are also waiting in the wings.

Dr. Frank Cullen is a database expert and co-founder of the Atlanta-based IT consulting firm Blackstone & Cullen.