Data quality vendors missing the mark: Study

There exist many opportunities for vendors in the data quality space to create tools that cover a broader scope, yet many of those opportunities remain untapped, according to an analyst.

Andy Hayler, president and CEO of U.K.-based analyst firm The Information Difference Ltd., said most data quality software vendors focus on traditional niches such as name-and-address tools for customer and supplier records.

But that represents only a small part of end user needs, said Hayler.

“Vendors have opportunities to educate the world about data quality because there is an awful lot of ignorance around this area,” he said.

A recent study on data quality by the Information Difference revealed that respondents view data quality as something that is not restricted to one area within the organization. Instead, two-thirds of respondents said it is an issue spanning the entire organization.

The survey, commissioned by location intelligence vendor Pitney Bowes Insight and data management vendor Silver Creek Systems, polled 200 companies in North America and Europe, almost half of which had revenues of $1 billion.

Specifically, 81 per cent of respondents reported focusing on a broader scope than merely customer name and address data; only 12 per cent said they held that narrow focus.

Moreover, when asked to rank data types by importance, respondents did not rate customer name and address data the highest. Instead, product (inventory) and financial (accounting figures) data were deemed greater priorities.

Yet, the vast majority of tools only focus on customer name and address data, noted Hayler.

“That said to me that there is an opportunity for vendors to address a broader range of data,” he said.

The study also delved into opinions on how data quality fit with master data management (MDM), or the quest for a single version of the truth across an organization’s data.

One-fifth of respondents felt data quality is a prerequisite to an MDM initiative and wanted to see more vendor offerings integrating those two areas.

Hayler said one would expect vendor partnerships between the areas of data quality and MDM, and that is precisely what is currently happening in the industry.

The reason MDM vendors have paid little attention to data quality, said Hayler, is that these vendors have traditionally focused on building systems that ingest data quickly, only to realize later that such systems were useless if the incoming data was bad.

It’s not just vendors who are failing to seize data quality opportunities. The survey found that enterprises are probably more content than they should be with their data quality.

Fifty-one per cent of respondents believed all was well with data quality, while only a quarter thought it was poor.

Yet, only about a third of respondents had a data quality program in place. Another third had plans to implement a program within the next year to three years.

Among those companies with a data quality initiative, on average three full-time staff are dedicated to the cause.

“People are fairly complacent,” said Hayler. “This complacency is why only a third of companies are really doing anything active with data quality.”

But the discrepancy between the state of data quality and existence of data quality initiatives signals that the perception of the quality of one’s data is not based on fact, said Hayler.

“In many cases, this is a warm, fuzzy perception,” he said.

Compounding this discrepancy is the fact that 42 per cent of respondents had made no effort whatsoever to measure the quality of their data. Among those who had, only 20 per cent were doing so at the enterprise level rather than by individual department.

The lack of importance placed on data quality, said Hayler, stems from management not seeing it as necessary and from the difficulty of presenting a business case.

Hayler said that measuring the cost to the business of bad data quality is a definite bonus to the business case.

“If you are unable to articulate the costs of poor data quality and therefore aren’t able to make a business case to fix it, it is very difficult to get management’s attention,” said Hayler.

Ted Morawiec, president and CEO of e-Net, a Toronto-based performance management system vendor who recently participated in a CIO roundtable discussion on data quality, said measuring data quality can be tricky because it involves a human element.

“It is not (just) software and hardware that is the expenditure,” said Morawiec. “It’s the people and the process that it is improving.”

Things like improvements in productivity must be measured, said Morawiec.

It’s essentially business activity monitoring, he said, and is a relatively new concept for executive management to digest.

Hayler said confusion over who owns data, and the quality of that data, also plays a role in keeping initiatives from getting off the ground: because data resides in IT systems, the natural tendency is to assume IT owns it.
