Enterprise data quality has long been defined solely by accuracy and real-time access, but a university professor is pushing a new model that better reflects today’s realities, along with benchmarks businesses can use to determine how data quality affects financial performance.
“That’s so nineties kind of thinking, my gosh,” said Anitesh Barua, a distinguished teaching professor and lead researcher at the University of Texas at Austin, of traditional data quality models.
Barua has augmented the old data quality model with new attributes such as intelligence, remote accessibility and sales mobility (data accessed through sales apps). He then measured 150 global Fortune 500 companies along these attributes, as well as on overall performance, to create a series of charts and graphs companies can use to gauge financial metrics that are key indicators of competitiveness, health and profitability.
“Back then if we had accurate data that was good enough. Now we’re asking if there is intelligence in it … is it data or is it actionable information?” said Barua.
The study found that certain data quality attributes have a direct impact on employee productivity, return on equity, return on invested capital, and return on assets, said Barua.
For instance, the report offers guidance along these lines: if a business improves a particular data attribute by X per cent, the expected benefit to return on assets is Y per cent.
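The guidance described above has the shape of a simple mapping from attribute improvement to financial benefit. A minimal sketch of that idea, assuming (purely for illustration) a linear relationship — the coefficients below are invented placeholders, not figures from Barua’s study:

```python
# Hypothetical benchmark coefficients: expected ROA benefit (per cent)
# per one per cent improvement in a data-quality attribute.
# These numbers are illustrative only; the study's actual values differ.
BENCHMARK_COEFFICIENTS = {
    "intelligence": 0.8,
    "remote_accessibility": 0.3,
    "sales_mobility": 0.5,
}

def expected_roa_benefit(attribute: str, improvement_pct: float) -> float:
    """Return the expected return-on-assets benefit (per cent) for a
    given percentage improvement in a data-quality attribute."""
    return BENCHMARK_COEFFICIENTS[attribute] * improvement_pct

# e.g. a 10 per cent improvement in data intelligence
benefit = expected_roa_benefit("intelligence", 10.0)
```

In practice the study’s charts may encode a more complex relationship; the point is only that each attribute carries its own measured weight on the financial metric.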
Barua said the advantage of tying data quality to financial metrics is it attracts the attention of the right people. “If we want senior management to be really involved in this stuff — the CEO, CFO, COO — to take notice, we have to demonstrate that when we do these things, ultimately it’s felt at the enterprise level,” he said.
The study also looked at the degree to which each data attribute had been developed, in terms of importance and awareness among respondents. Least well developed was intelligence of data, a finding that did not surprise Barua, who said data intelligence is much broader than the narrow sales and customer focus businesses afford it.
Remote accessibility of data was the best developed, a fitting result given a workforce that is increasingly mobile and remote.
Barua said he focused the study on Fortune 500 companies because they are presumed to be well managed. But he found that “there is still an issue” with data quality even among that group. “Intelligence (of data) is low, but the effect of intelligence is big,” said Barua, referring to how underdeveloped that attribute proved to be.
He thinks data quality is likely affected by, among other things, a perception that deploying the right technologies for data management will cost some staff their jobs. “There is a mindset problem when they see IT as a substitute (for) labour,” said Barua.
A study earlier this year by Ventana Research Inc. identified contributors to poor data quality that included discrepancies in data formats, IT systems housing multiple versions of the same data, and flawed business processes that relay data across an organization.
Building a blueprint that maps out how data sources relate to IT systems and how data flows between stakeholders is useful for determining where the disconnects lie, said Mark Smith, CEO and executive vice-president of research with the Pleasanton, Calif.-based firm.
The study by Ventana Research found that 57 per cent of organizations polled are still trying to make sense of their data issues.
Follow Kathleen Lau on Twitter: @KathleenLau