Big Data has become a focus of many IT discussions over the past two years, for good reason: thanks to virtualization and Hadoop, we're able to store more data than ever, and thanks to in-memory databases and advances in compute power, we're able to analyze more data than ever.

But, analyst David Woods argues, that doesn't mean a business intelligence strategy should consist of creating one huge data store for analysis.

Instead, he says, organizations should think of data in terms of a supply chain: a series of data inputs drawn from a variety of sources and stored in several repositories for analysis.

It follows, he argues in a recent column in Forbes, that preserving the context of data is vital.

Think about it this way: You may have a pile of customer transaction data showing, say, that when men buy socks they also buy watches, but if you don't know how old that data is, you don't know how accurate it is.

I’ve simplified the argument, but I hope you get the idea. Context, Woods says, gives data its value for analysis.

He also warns that in creating a data warehouse for analysis, an organization may sometimes have to capture why some data was kept and other data excluded.

It sounds as though this approach may result in organizations keeping far more data than they expect.

However, one of his points is that thinking in terms of data supply chains may increase the value of data warehouses, because the contextual information will be kept up to date.

It will take time for an organization to figure out what works, he adds.

You may find his argument persuasive.

Read his full column here.