
Why searching for the perfect metric is futile

Big data has the potential to alter the calculations by which data management groups buy, manage and structure information storage, but the search for the perfect metric could lead many organizations astray, according to analytics experts.

If companies can find a place for Hadoop in the data warehouse, they ought to investigate the possibilities of using big data in areas such as establishing metrics for a “dollars-per-terabyte” model, said Philip Russom, director of research in data management at TDWI Research.

Many organizations, he said, may be cautious about moving forward on this because dollars-per-TB metrics are still “fuzzy.”

However, searching for the perfect metric will only bog down projects, according to Russom.

In reality, he said, most metrics are fuzzy anyway. “I think that in this case, it’s better to have a metric even if it’s fuzzy – or less than optimal – than not to have a metric of any kind at all.”


At a recent TDWI World Conference, Ken Rudin, head of analytics at Facebook, noted that many organizations become so obsessed with developing the perfect metric that metrics don’t get used as effectively as they should.

While analysts cannot always do a “perfectly statistically controlled A/B test…there’s always some way to figure out how we’ve improved versus historical trends,” he said. “Let’s use our metrics to narrow it down” to make a decision.
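
As a rough sketch of what that comparison against historical trends might look like, the lines below compute a trailing baseline and check an illustrative lift threshold. Every figure, the metric name and the five-per-cent cutoff are hypothetical, not anything Rudin prescribed:

    # Compare a metric after a change against its historical baseline when
    # a controlled A/B test is not possible. All figures are illustrative.
    from statistics import mean

    weekly_conversion = [0.031, 0.029, 0.033, 0.030, 0.032, 0.031]  # before the change
    post_change = [0.036, 0.035, 0.037]                             # after the change

    baseline = mean(weekly_conversion)
    observed = mean(post_change)
    lift = (observed - baseline) / baseline

    print(f"baseline={baseline:.4f} observed={observed:.4f} lift={lift:+.1%}")
    if lift > 0.05:  # arbitrary decision threshold for this sketch
        print("Improved versus the historical trend; keep the change")
    else:
        print("No clear improvement; investigate further")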

There likely isn’t a golden algorithm, according to Russom. “If we learn nothing else from business intelligence and data warehousing, it’s that every organization is very different in the collection of sources they have, in-house skills, deployed platforms…and so on.”

He said organizations should treat dollars-per-TB as a “bare-bones metric” and build their own algorithms on top of it.
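
As a rough illustration of the difference between that bare-bones figure and an organization-specific extension of it, here is a minimal sketch; every cost category and number in it is hypothetical:

    # Bare-bones dollars-per-TB versus a "fully loaded" variant that folds in
    # organization-specific costs. All categories and figures are made up.
    def dollars_per_tb(total_cost: float, usable_tb: float) -> float:
        """Total storage spend divided by usable capacity, in dollars per TB."""
        return total_cost / usable_tb

    # Hypothetical annual costs for a storage platform (illustrative only).
    hardware = 120_000.0
    admin_labour = 80_000.0
    power_and_cooling = 15_000.0
    usable_capacity_tb = 500.0

    bare_bones = dollars_per_tb(hardware, usable_capacity_tb)
    fully_loaded = dollars_per_tb(hardware + admin_labour + power_and_cooling,
                                  usable_capacity_tb)

    print(f"bare-bones:   ${bare_bones:,.0f}/TB")
    print(f"fully loaded: ${fully_loaded:,.0f}/TB")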

