
In 2002, the Data Warehousing Institute estimated that poor data quality cost U.S. businesses more than US$ 600 billion a year. Factor in the rest of the world, plus the accelerating explosion in data generation and gathering since then, and the dollar-figure cost of poor data today is probably well upwards of US$ 1 trillion, and climbing.

The ripple effects of bad data in undercutting business efficiency aren’t hard to imagine. It’s even possible to estimate actual dollar-figure costs with a fair degree of accuracy within an individual organization – at least when it comes to customer data.

“It’s like personal finances. I go for the cheapest goods sometimes. But the cheapest isn’t always the most inexpensive. In the long run they can be the most expensive.”
– Robert Lerner

Reliable figures are far more elusive for data from other operational areas such as finance, manufacturing, procurement or human resources. This kind of information tends to stay hidden much of the time, and its link to overall efficiency and profitability can be hard to demonstrate.

So getting C-level executives to commit serious money to improving the quality and accuracy of data remains a hard sell for the IT manager.

If low-quality data is expensive, improving it costs money too. For small to mid-sized businesses, data improvement efforts are especially challenging, as these organizations lack the financial resources and in-house IT talent that large enterprises have at their disposal.

They are also often burdened with relatively steep investments in legacy IT systems that they’re reluctant to abandon for new hardware and software, especially big-ticket items like data warehouses and data mining systems.

“Too many businesses don’t understand how important the quality and accuracy of their data is,” says Robert Lerner, senior analyst for data warehousing and application infrastructure at Virginia-based Current Analysis. “I’ve seen this in companies where they want to do something with their customer data, so they can access it more easily and understand their customers better. CRM sells in a way data quality never will, so the first thing they do is work on a CRM system. Then, if there’s money left over, maybe they’ll put it into a data quality project.”

Even organizations that do seem to understand the importance of data quality often fail to follow through when cost-cutting becomes a priority. Lerner recalls one company that implemented a data quality solution side-by-side with a new CRM system: when budgets got tight the data quality system was first on the chopping block.

Part of the problem is complexity. Data management isn’t nearly as easy to understand as CRM. At the bottom end, it can be as simple as a spreadsheet-based reporting tool; at the top end, it can encompass automated, enterprise-wide data aggregation and analysis systems. Done properly, it also involves redesigning business processes to make optimal use of the data ‘liberated’ by the technology.

Lerner proposes a five-level approach, beginning with data profiling tools and moving through data quality, integration, enrichment and monitoring processes – each with its own category of supporting IT tools – and encompassing legacy data and metadata. Other approaches vary the number of tiers, but cover the same range of processes.
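
What the first of those levels looks like in practice can be surprisingly modest. The sketch below is a rough illustration only, not drawn from any vendor's product: it assumes a customer extract loaded into a pandas DataFrame, and the column names (such as postal_code) are hypothetical. It simply surfaces row counts, null rates, distinct-value counts and obviously malformed values, which is the kind of ground truth a profiling pass is meant to produce.

    # Minimal, illustrative data-profiling pass over a customer extract.
    # Assumes pandas is available; all column names are hypothetical.
    import re
    import pandas as pd

    def profile(df: pd.DataFrame) -> pd.DataFrame:
        """Return per-column row counts, null rates and distinct-value counts."""
        stats = []
        for col in df.columns:
            stats.append({
                "column": col,
                "rows": len(df),
                "null_pct": round(df[col].isna().mean() * 100, 1),
                "distinct": df[col].nunique(dropna=True),
            })
        return pd.DataFrame(stats)

    # Example check: flag records whose (hypothetical) Canadian postal code is malformed.
    POSTAL = re.compile(r"^[A-Z]\d[A-Z] ?\d[A-Z]\d$")

    def bad_postal_codes(df: pd.DataFrame) -> pd.DataFrame:
        mask = ~df["postal_code"].fillna("").str.upper().str.match(POSTAL)
        return df[mask]

    if __name__ == "__main__":
        customers = pd.DataFrame({
            "name": ["Acme Ltd.", "Acme Ltd.", None],
            "postal_code": ["N2G 4X8", "n2g4x8", "12345"],
        })
        print(profile(customers))
        print(bad_postal_codes(customers))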

Adding to the challenge for SMBs is the fact that – until recently – the market for smaller organizations had not been well served by technology vendors.

This forced smaller companies to bite the bullet and opt for databases, data warehouses and other big-ticket technologies designed for large enterprises.

The situation has improved noticeably in the last couple of years, however, according to Jake Freivald, vice-president of New York-based iWay Software. iWay is a data integration services subsidiary of Information Builders Inc. (IBI), a business intelligence (BI) systems vendor.

Freivald believes the improvement in technology offerings has a lot to do with advances in data integration know-how, decreased hardware costs and increased processing power. But technology is still just a tool, and careful planning of business processes and the data required to support them is essential both for success and for keeping costs under control. For SMBs, the optimum solution takes advantage of existing IT infrastructure and legacy data as much as possible, to keep at least the up-front costs in line.

“The key is to find the most critical thing you provide and define the business data associated with that,” says Freivald. “In other words, figure out what you want first, then have your IT people focus on creating those services. And hopefully these will be reusable. For example the same customer data is used in a data warehouse but can also be made available for a Web portal lookup and for the help desk or contact center.”
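
Freivald's define-once, reuse-everywhere point can be pictured with a small sketch. The code below is hypothetical (the class, database table and field names are invented for illustration, not taken from iWay or IBI), but it shows the shape of the idea: one service owns the cleansed customer record, and the web portal, the warehouse extract and the contact centre all call the same code rather than each keeping its own copy of the logic.

    # Illustrative only: one authoritative customer-lookup service reused by
    # several consumers. All names here are hypothetical.
    import sqlite3
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Customer:
        customer_id: int
        name: str
        email: str

    class CustomerService:
        """Single place where customer data is read and normalized."""

        def __init__(self, db_path: str = "customers.db"):
            self.conn = sqlite3.connect(db_path)

        def get_customer(self, customer_id: int) -> Optional[Customer]:
            row = self.conn.execute(
                "SELECT customer_id, name, email FROM customers WHERE customer_id = ?",
                (customer_id,),
            ).fetchone()
            if row is None:
                return None
            # The same cleansing rules apply no matter which consumer is asking.
            return Customer(row[0], row[1].strip().title(), row[2].strip().lower())

    # The same service backs very different consumers:
    def portal_lookup(svc: CustomerService, customer_id: int) -> dict:
        c = svc.get_customer(customer_id)
        return {"found": c is not None, "name": c.name if c else None}

    def warehouse_extract(svc: CustomerService, ids: list) -> list:
        return [c for i in ids if (c := svc.get_customer(i)) is not None]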

Crawford Adjusters Canada (Kitchener, Ont.), a provider of claims management solutions to insurance companies, followed the look-first model in building a data management system that will provide feedback on service level agreements and key performance indicators to customers and the company’s own field staff. The first staged rollout will begin this summer.

Chief information officer Scott Sutherland and his team started the project last year by asking executives, managers and the business users what they needed in terms of actionable information, and how they wanted it presented. The answer: show us essential key performance indicators (KPIs) and tell us how well we’re doing against service level agreements (SLAs) with customers.

Surprisingly, Sutherland found that ad hoc reporting ability – the core of so many data management efforts – was less important to the company’s users than a ‘stock-ticker’ desktop application that runs key metrics across the screen in real time. “In today’s world people don’t want to have to go run a report,” Sutherland says. “They just want to see that number, their KPI or whatever metrics they need.”
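
A bare-bones version of that ticker idea is easy to picture: a small client that polls the back end on an interval and streams the latest numbers to the screen, instead of waiting for someone to request a report. The sketch below is purely illustrative; fetch_kpis() is a hypothetical stand-in for whatever query Crawford actually runs, and the figures are invented.

    # Sketch of a 'ticker' client: poll current KPI values on an interval
    # and stream them to the desktop. fetch_kpis() is a hypothetical stub.
    import time
    from datetime import datetime

    def fetch_kpis() -> dict:
        # In a real system this would query the operational database or OLAP layer.
        return {"open_claims": 412, "avg_cycle_days": 9.4, "sla_met_pct": 96.2}

    def run_ticker(poll_seconds: int = 30, iterations: int = 3) -> None:
        for _ in range(iterations):
            kpis = fetch_kpis()
            line = " | ".join(f"{name}: {value}" for name, value in kpis.items())
            print(f"[{datetime.now():%H:%M:%S}] {line}")
            time.sleep(poll_seconds)

    if __name__ == "__main__":
        run_ticker(poll_seconds=1)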

Sutherland was happy to skip the reporting tool after he saw the sticker price of an online reporting package. “It had the functionality we needed but the upfront costs were absolutely huge. It was something that a large company like a DuPont or GM could afford but it wasn’t in our ballpark at all.”

Crawford has built up its own IT systems over the years and already had some good technology ready at hand. Now, instead of dropping in an expensive product and setting up his data to work with it, Sutherland is focusing on hooking up back-end business data to get the right information in the right format to the right people. He is also building a new online analytical processing (OLAP) server. The costs the company is incurring lie in that in-house programming activity.
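
The kind of answer an OLAP layer is meant to deliver is essentially a roll-up across business dimensions. The fragment below is a generic illustration of that idea in pandas, not a description of the system Sutherland's team is building: Crawford's actual schema is not public, so the regions, months and cycle-time figures are invented.

    # Generic OLAP-style roll-up: claim counts and average cycle time by
    # region and month. Data and column names are hypothetical.
    import pandas as pd

    claims = pd.DataFrame({
        "region": ["ON", "ON", "BC", "BC"],
        "month": ["2006-05", "2006-06", "2006-05", "2006-06"],
        "cycle_days": [8, 11, 9, 7],
    })

    rollup = (
        claims.groupby(["region", "month"])
        .agg(claim_count=("cycle_days", "size"), avg_cycle_days=("cycle_days", "mean"))
        .reset_index()
    )
    print(rollup)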

Crawford’s U.S. parent company has given its Canadian subsidiary a lot of autonomy when it comes to information technology. The relaxed attitude stems in part from the fact that the U.S. head office has legacy system issues of its own to deal with. Which is where the build-or-buy debate becomes pointed: if head office decides to circumvent those legacy issues by opting for a brand new off-the-shelf enterprise data management system, will it impose that choice on its subsidiaries?

Sutherland believes the time and money Crawford Canada is
