Hadoop ready for production duty

NEW YORK — After six years of development and fine-tuning, the Apache Hadoop data processing framework is finally ready for full production use, the developers of the software announced Wednesday.
 
The project team behind Apache Hadoop has released version 1.0 of its platform. “Users can be much more confident that this release will be supported by the open source community,” said Apache Hadoop vice-president Arun Murthy. “There is no more confusion over which version of Hadoop to use for which feature.”
 
Three new additions in particular helped make this release worthy of the 1.0 designation, Murthy explained. End-to-end security is the chief feature: Hadoop can now be secured across an entire network using the Kerberos network authentication protocol. As a result, enterprises can now trust their Hadoop deployments with sensitive and personal data.
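In practice, the security mode is driven by Hadoop's own configuration files. As a rough sketch (not a complete setup), switching authentication from the default "simple" mode to Kerberos comes down to entries like the following in core-site.xml; a real deployment also needs Kerberos principals and keytab files configured for each daemon:

    <!-- core-site.xml: a minimal sketch of turning on Kerberos security -->
    <property>
      <name>hadoop.security.authentication</name>
      <value>kerberos</value>
      <!-- the default value, "simple", means no authentication -->
    </property>
    <property>
      <name>hadoop.security.authorization</name>
      <value>true</value>
      <!-- enforce service-level access checks between daemons and clients -->
    </property>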
The second feature, the WebHDFS REST API (representational state transfer application programming interface), lets administrators and programmers interact with Hadoop using Web technologies they already understand, making Hadoop accessible to more organizations. Finally, this version is the first to fully support HBase, a non-relational, column-oriented database modeled on Google's BigTable, which gives administrators a familiar table-like structure for storing their data.
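To illustrate, WebHDFS exposes file system operations as plain HTTP calls against URLs of the form http://<namenode>:50070/webhdfs/v1/<path>?op=<OPERATION> (the feature itself is switched on with the dfs.webhdfs.enabled property in hdfs-site.xml). A minimal Java sketch that reads a file over WebHDFS might look like this; the host name and file path are hypothetical:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class WebHdfsRead {
        public static void main(String[] args) throws Exception {
            // Hypothetical NameNode host; 50070 is the default HTTP port in Hadoop 1.0.
            URL url = new URL(
                "http://namenode.example.com:50070/webhdfs/v1/user/alice/input.txt?op=OPEN");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            // OPEN redirects to a DataNode, which streams the file contents;
            // HttpURLConnection follows the redirect automatically.
            BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
            in.close();
        }
    }

HBase, for its part, is driven from a Java client API. A sketch of writing a single cell to a hypothetical table, using the client API of the HBase releases current at the time:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBasePutSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml from the classpath
            HTable table = new HTable(conf, "webtable");      // "webtable" is a hypothetical table name
            Put put = new Put(Bytes.toBytes("row1"));         // rows are keyed by arbitrary byte arrays
            // Store a value under column family "contents", qualifier "html".
            put.add(Bytes.toBytes("contents"), Bytes.toBytes("html"),
                    Bytes.toBytes("<html>...</html>"));
            table.put(put);
            table.close();
        }
    }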

Lucene developer Doug Cutting, along with Mike Cafarella, created Hadoop in 2005 as an implementation of Google’s MapReduce algorithm, a technique for analyzing data spread out across many different servers. Cutting later went to work for Yahoo, helping the portal company apply the technology to its search service, a deployment that eventually spanned more than 40,000 servers.
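In the MapReduce model, a "map" function runs in parallel over chunks of input stored across the cluster, and a "reduce" function aggregates the map output. The canonical illustration is counting words in a mass of text; a condensed version of the word-count example from Hadoop's own tutorial looks roughly like this:

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

        // Map step: emit (word, 1) for every word in an input line.
        public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
            private final static IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            public void map(Object key, Text value, Context context)
                    throws IOException, InterruptedException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    context.write(word, ONE);
                }
            }
        }

        // Reduce step: sum the counts collected for each word.
        public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
            private final IntWritable result = new IntWritable();

            public void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable val : values) {
                    sum += val.get();
                }
                result.set(sum);
                context.write(key, result);
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = new Job(new Configuration(), "word count");
            job.setJarByClass(WordCount.class);
            job.setMapperClass(TokenizerMapper.class);
            job.setCombinerClass(IntSumReducer.class);   // local pre-aggregation on each node
            job.setReducerClass(IntSumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory in HDFS
            FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory in HDFS
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }

Hadoop sends the map tasks to the nodes that already hold the data, shuffles the intermediate pairs by key, and runs the reduce tasks in parallel, which is what lets the same few dozen lines scale from one machine to thousands.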
 
Hadoop can be used to store and analyze large data sets, often called big data. Although originally designed to aid large search services, the technology is increasingly finding a home within enterprises as well, Murthy said. The project has at least 35 code committers and hundreds of other contributors.

Using Hadoop for data analysis can be handy for data sets that are too large for traditional relational databases, or in cases where an organization collects lots of data but doesn't yet know what analysis it will need to run on that data. Financial services conglomerate JPMorgan Chase & Co. uses the technology for fraud detection and risk management, while eBay is using it to build a new search engine for its auction service.

The technology has also received considerable commercial support. Startups Cloudera, Yahoo spinoff Hortonworks and MapR all offer commercial distributions of the software. IBM Corp. [NYSE: IBM] has incorporated Hadoop into its InfoSphere BigInsights data analysis package, and Microsoft Corp. [Nasdaq: MSFT] has a version of Hadoop running on its Windows Azure cloud service.
