IT’s Big Data Blunder

Many organizations are pouring millions of dollars into tools and data scientists, yet they continue to struggle to turn up any meaningful returns from their big data initiatives.

Behind this big data debacle, according to an article in the Harvard Business Review, is a common blunder by many IT departments: treating their analytics and big data projects just as they would other IT projects, such as an enterprise resource planning (ERP) or customer relationship management (CRM) installation.

Traditional project planning methods typically involve establishing requirements and technical specifications upfront, building and deploying technology according to pre-conceived plans, and delivering it on time and on budget.

This method may work well for big data projects when an organization's goal is to boost efficiency, reduce expenses and improve processes. However, even when projects deliver these benefits, the research of authors Donald Marchand and Joe Peppard has shown that many business executives remain unsatisfied with the results.

The gaffe, they said, lies in IT's failure to determine how to effectively use the information generated by a big data project, whether to help users make better decisions or to extract unexpected insights that could be useful to the business.

For instance, a common notion in many enterprise organizations is that big data projects are meant to provide high-quality information to executives and managers so that they can make better decisions.

This line of thought does not take into account that many people, including managers and executives, have biases that often hinder their decision-making capabilities.

Marchand and Peppard recommend that users themselves, rather than just their managers, be given access to the information turned up by big data programs. It is the users who can create meaning out of the information.

Most IT systems are designed on the assumption that the data involved has already been deemed important. Such a method may be useful for highly structured tasks that require precision. But people do not think in a vacuum: this approach fails to recognize that people organize, interpret and use information based on their experiences, mental models and needs.

This conventional IT approach tends to fall apart when requirements call for the data to be “taken out of the technology domain” and brought into the “human domain.”



Nestor E. Arellano
Toronto-based journalist specializing in technology and business news. Blogs and tweets on the latest tech trends and gadgets.
