Don’t get carried away by Hadoop’s ‘gee-whiz’ factor

Companies should take a pragmatic approach to implementing Hadoop for their “big data” requirements, a new report released Tuesday by analyst firm Forrester Research urges.

The guidance is based on the experiences of some early adopters of Hadoop, including Yahoo, AOL, Mozilla and Klout. It cautions companies against getting carried away by the hype surrounding the technology and advocates a staged Hadoop deployment driven purely by business goals.

Avoid Hadoop “science projects” that lack business value, the report noted. “Be careful not to mistake Hadoop’s technological gee-whiz factor for a genuine business case that delivers real business value.”

Hadoop is an open source technology designed to help companies manage and process extremely large volumes of data. Much of Hadoop’s growing popularity lies in its ability to break up large data sets into smaller data blocks that are then distributed across a cluster of commodity hardware for faster processing.
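To make that model concrete, here is the canonical word-count job written against Hadoop’s Java MapReduce API. This is a minimal sketch rather than a production example; the input and output paths, taken here from the command line, are assumed to point at directories in the cluster’s distributed file system.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: each task processes one block of the input independently,
  // emitting a (word, 1) pair for every token it sees.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: the framework gathers every count emitted for a given word
  // across all mappers, and the reducer sums them.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // pre-sum on each node to cut shuffle traffic
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory (assumed)
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory (assumed)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Each map task runs against a single block of input wherever that block happens to be stored, and the framework shuffles the intermediate word counts to the reducers; that data locality is what lets a job scale out across clusters of commodity hardware.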

Early adopters have been using Hadoop to store and analyze petabytes of unstructured data that other enterprise data warehouse technologies have not been able to handle as easily. The fact that it is an open source technology with expanding vendor support has also contributed to its allure within enterprises.

“Hadoop is the Linux of enterprise data warehouses,” said James Kobielus, a Forrester analyst and one of the co-authors of the report. “It is being adopted and forked and tweaked and optimized,” in much the same way that Linux was in the early days, he said.

Hadoop holds significant promise for enterprises, especially in cloud-based environments, Kobielus said. “It was architected from the get-go to be more than a data warehouse. It was architected from the start to handle huge volumes of complex, unstructured content and not just structured, relational data like conventional relational database technologies are,” he said.

Even so, the technology is still a work in progress, he said. Hadoop’s relative immaturity, its lack of standards, the dearth of commercial offerings and the scarcity of available skills all pose big challenges for enterprises, he said.

The Forrester report outlines a set of best practices that Kobielus said companies can use to guide their Hadoop deployments.

One recommendation is for companies to align their Hadoop initiatives with a clear big data business strategy, Kobielus said. IT managers need to identify specific business situations where Hadoop’s capabilities can deliver clear benefits. Yahoo’s business case for using Hadoop, for instance, has always been to support analytics on very large data sets for ad placement purposes, Kobielus said.

Companies need similarly specific goals when implementing Hadoop. In most cases, it is best to start with projects that offer near-term benefits and a tangible impact. Before deploying Hadoop, it is also worth checking whether existing enterprise data warehouse technologies can handle the requirements.

“Explore a big data approach like Hadoop only if your data volumes are likely to scale to the high-terabyte or even petabyte level and include a wide variety of data types,” the Forrester report noted. “If you overinvest in data storage, computing, and networking capacity, you’ll add to your cost overhead without any concomitant business benefit,” it warned.

Forrester also recommended that companies use enterprise-grade platforms and tools for powering their Hadoop environments. Where possible, companies should evaluate the functionality, scalability and maturity of the commercial offerings and tools that build on and extend the Apache Hadoop open source distribution.

Some examples include Hadoop distributions from vendors such as Cloudera, EMC Greenplum, IBM and Hortonworks, the report said. It’s unwise to attempt to build an in-house Hadoop stack when there are commercial options, Forrester said.

It’s also unwise to standardize on or commit completely to one vendor’s distribution at such an early stage, because it is unclear how the market will evolve over the next few years, Kobielus said.

Companies would also do well to build their own Hadoop centers of excellence, Kobielus said. Given the severe shortage of skilled Hadoop professionals, companies should try to develop Hadoop skills in-house as much as they can.

The focus should be on upgrading the skills of database administrators and integration specialists and getting them familiar with technologies such as MapReduce. They should also connect with the broader Hadoop community to keep abreast of best practices and trends, Kobielus said.
