Proactive data management crucial to IT project success

High-quality data is crucial to good IT management.

This fact is finally being recognized, and the recent dramatic consolidation in the “data quality” market is proof of that.

For the most part, enterprises have learned about the cost of poor data quality the hard way, through repeated failures of major IT projects, cost overruns, and schedules gone awry.

A proactive approach

Faced with mountains of dirty data, disparate systems, tight budgets, and long lists of business requests, companies that want to improve the quality of their data often do not know where to start.

That’s one reason why, in the past, many firms focused on the issue only after a major IT project failed because of poor data quality.

Today, enterprises can adopt a proactive rather than a reactive approach to data management.

Tools for data profiling and data discovery are available to avert risks arising from “dirty data.” Companies are using them to more precisely scope out large data projects.
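
As a rough illustration only (not tied to any particular profiling product), the short Python sketch below computes the kind of column-level summary such tools automate: how complete each field is, how many distinct values it holds, and whether whole rows repeat. The sample table and column names are hypothetical.

    # A minimal data-profiling sketch using pandas. The table, column
    # names, and sample values are hypothetical; commercial profiling
    # tools automate checks like these at far larger scale.
    import pandas as pd

    def profile(df: pd.DataFrame) -> pd.DataFrame:
        """Report completeness and cardinality for each column."""
        return pd.DataFrame({
            "pct_populated": (df.notna().mean() * 100).round(1),  # completeness
            "distinct_values": df.nunique(),                      # cardinality
        })

    customers = pd.DataFrame({
        "customer_id": [101, 102, 102, None],
        "postal_code": ["M5V 2T6", "m5v2t6", "m5v2t6", None],
    })
    print(profile(customers))
    print("duplicate rows:", int(customers.duplicated().sum()))

Even a summary this crude surfaces the questions a project team needs to ask early, such as why identifiers are missing or duplicated, before those surprises derail a schedule.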

It may not especially matter which data you start with. What does matter is when you start thinking about data quality. It pays to plan ahead, and not just for today’s data and technology requirements but for what users will need down the road.

Taking advantage of external rules-based data quality processes, companies are reusing work done on one project to launch others. Even though rules may be modified and refined from one project to another, this kind of reuse still saves time and money in implementation. It also promotes consistency across projects and systems.
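
To make that reuse concrete, here is an illustrative sketch (the rule names, fields, and patterns are hypothetical, not drawn from any specific product): rules are defined as data rather than buried in application code, so a second project can adopt the same set and refine only the rules that differ.

    # Hypothetical sketch of externalized, reusable data quality rules.
    # Each rule is a named (field, test) pair, so a new project can reuse
    # the base set and override individual rules without new code.
    import re

    BASE_RULES = {
        "customer_id_present": ("customer_id", lambda v: v is not None),
        "postal_code_format": ("postal_code",
                               lambda v: bool(re.fullmatch(r"[A-Z]\d[A-Z] ?\d[A-Z]\d", v or ""))),
    }

    def apply_rules(record: dict, rules: dict) -> list:
        """Return the names of the rules a record violates."""
        return [name for name, (field, test) in rules.items()
                if not test(record.get(field))]

    # A later project reuses the base rules, refining only the postal code
    # rule so that it also accepts five-digit US ZIP codes.
    PROJECT_B_RULES = dict(BASE_RULES)
    PROJECT_B_RULES["postal_code_format"] = ("postal_code",
        lambda v: bool(re.fullmatch(r"[A-Z]\d[A-Z] ?\d[A-Z]\d|\d{5}", v or "")))

    record = {"customer_id": 101, "postal_code": "90210"}
    print(apply_rules(record, BASE_RULES))       # ['postal_code_format']
    print(apply_rules(record, PROJECT_B_RULES))  # []

Because both projects run the same engine against shared rule definitions, a correction made once is applied consistently everywhere the rules are used.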

A single data quality initiative can be transformed into a wider data governance program, where rules and metadata associated with various projects can be administered centrally.

Multifaceted data

Another change is in the types of data companies want to improve and maintain. As businesses shift from product-centric data management to a more customer-centric approach, even customer data has come to mean far more than names and addresses.

Today’s data is also less likely to be a single, monolithic record of information delivered to all users and systems. Instead it is becoming more multifaceted, serving various business purposes – often simultaneously – in several operational, analytic, and reporting applications.

The “single view” that businesses often want obscures the complexity of business purposes that such data must serve.

Some forward-looking companies are now using data quality enhancement processes across a range of enterprise systems, applications, and business processes.

The case for consolidation

The current interest in Master Data Management (MDM) and Customer Data Integration (CDI) reflects the interest in consolidating and centralizing data stores. Pressure to build business processes that span applications in a service-oriented environment is driving these large integration initiatives. It is also clear that compliance and governance are important factors when planning such projects.

One dramatic change relates to the demand for real-time connectors between systems to support broader data quality practices. Such connectors – to multiple customer relationship management (CRM), enterprise resource planning (ERP), and supply chain management (SCM) systems – help companies maintain complete customer and supplier lists and track inventory.

To help organizations ensure new data meets standards they have established for existing data stores, data quality processes are being built into Web transactions. Data warehouse refreshes – currently more likely to be daily than weekly or monthly – regularly include data quality processes, either at the source systems or at the warehouse.
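
The pattern at the warehouse boundary can be sketched in a few lines (the check and the record fields below are hypothetical placeholders for real rules): incoming rows that fail the checks are diverted for remediation rather than loaded alongside clean data.

    # Illustrative data quality gate on a warehouse load. The check and
    # the record fields are hypothetical stand-ins for real rules.
    def passes_checks(record: dict) -> bool:
        return record.get("customer_id") is not None and bool(record.get("postal_code"))

    def split_batch(batch: list) -> tuple:
        """Separate an incoming batch into loadable rows and rejects."""
        clean = [r for r in batch if passes_checks(r)]
        rejects = [r for r in batch if not passes_checks(r)]
        return clean, rejects

    clean, rejects = split_batch([
        {"customer_id": 101, "postal_code": "M5V 2T6"},
        {"customer_id": None, "postal_code": ""},
    ])
    print(len(clean), "rows loaded,", len(rejects), "sent for remediation")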

Data Quality Services

As data quality evolves into a service accessible to all systems, applications, and SOA business processes, companies have multiple options for accessing and using such services. Best results may be achieved by embedding these services in the business process itself. Data quality then becomes part of standard and specific business processes, supporting purposes such as fraud detection and the tracking and management of RFID data.
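
A hypothetical sketch of that embedding: a business process calls a data quality service over HTTP before committing a transaction. The endpoint, payload, and response fields here are assumptions for illustration, not any vendor’s actual API.

    # Hypothetical embedding of a data quality service call inside a
    # business process. The endpoint URL and response shape are assumed
    # for illustration only.
    import requests

    DQ_SERVICE_URL = "https://dq.example.internal/validate"  # hypothetical endpoint

    def place_order(order: dict) -> bool:
        """Check the customer record with the DQ service before committing the order."""
        resp = requests.post(DQ_SERVICE_URL, json=order["customer"], timeout=5)
        resp.raise_for_status()
        result = resp.json()  # assumed shape: {"valid": bool, "issues": [...]}
        if not result.get("valid", False):
            # Hold the order and route the record for remediation instead of
            # letting bad data flow into downstream systems.
            print("Order held for data remediation:", result.get("issues"))
            return False
        # ...commit the order to the downstream system here...
        return True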

Large multinational enterprises may choose to have their own enterprise data quality service. This type of service will behave like a complex adaptive system: capable of learning, detecting new conditions in the data, responding to new business requirements, and adapting far more automatically than today’s systems do.

At the other end of the spectrum, small and medium-sized companies in particular will access data quality on demand.

In this configuration, application service providers will offer hosted software services, slicing and dicing data quality functionality and packaging it to serve specific, widespread needs.

In either case, it seems clear that data quality is poised to enter the mainstream of information technology, where it will play a critical role in ensuring that good, reliable, useful information is available to users throughout the organization.

Len Dubois is the Vice President of Marketing for the Trillium Software division of Harte-Hanks LLC. He has authored numerous articles on data quality and CRM and presents frequently at national and international conferences. Dubois can be reached at ldubois@trilliumsoftware.com.
