Ransomware attacks on insecure Hadoop systems may be next, say security researchers

Security researchers who this month issued alerts about ransomware attacks on insecure MongoDB and Elasticsearch clusters are now warning organizations running Hadoop that their clusters may be attackers' next targets.

Why? Because they’re easy to get at.

“It looks like 2017 is the year all the low-hanging fruit (open systems) will be saved just in time or perish by ransom attacks,” Victor Gevers of the GDI Foundation, a Dutch non-profit working for an open and secure Internet, said in an email.

An open source distributed file system designed to store very large files across multiple servers, paired with an analysis engine, Hadoop has been adopted by cost-conscious organizations to handle Big Data problems. Two of the biggest distributions come from Cloudera and Hortonworks. Amazon runs its own version, Elastic MapReduce (EMR), on its Elastic Compute Cloud (EC2) infrastructure.

But not all enterprise implementations are secure. In fact, a warning note released by Gevers and his colleagues reminds administrators that Hadoop’s default Web interface settings have security and safemode turned off.

“The default installation for HDFS Admin binds to the IP address 0.0.0.0 and allows any unauthenticated user to perform super user functions to a Hadoop cluster,” says the warning. “These functions can be performed via a web browser, and do not prevent an attacker from destructive actions. This may include destroying data nodes, data volumes, or snapshots with TBs of data in seconds.”
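
Checking for that exposure is as simple as the warning suggests. Below is a minimal probe sketch in Python, using only the standard library; the host name is a hypothetical placeholder, port 50070 is the Hadoop 2.x default for the NameNode web interface, and it assumes WebHDFS is enabled, as it commonly is. The sketch only lists the file system root, but the same unauthenticated REST API also accepts destructive operations such as op=DELETE.

```python
# Minimal probe sketch: does this NameNode's web interface answer
# unauthenticated WebHDFS requests? The host is a hypothetical placeholder;
# 50070 is the default NameNode web port in Hadoop 2.x.
import json
import urllib.request

NAMENODE = "namenode.example.com"  # placeholder, point at your own cluster
URL = f"http://{NAMENODE}:50070/webhdfs/v1/?op=LISTSTATUS"

try:
    with urllib.request.urlopen(URL, timeout=5) as resp:
        listing = json.load(resp)
except OSError as exc:
    print(f"Not reachable or refused: {exc}")
else:
    # Success with no credentials means anyone who can reach the port can
    # browse the file system root, and the same API accepts op=DELETE.
    print("EXPOSED: unauthenticated listing of / succeeded")
    for entry in listing["FileStatuses"]["FileStatus"]:
        print(" ", entry["pathSuffix"] or "/", entry["type"])
```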

As a recent article on Datanami.com pointed out, some organizations simply dump log data into Hadoop data lakes for analysis without considering whether it might include sensitive information. That’s why some of the experts quoted believe efforts to secure long-term data storage, including data classification, encryption and obfuscation, will increase in 2017.
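
As an illustration of what that obfuscation can look like in practice, the sketch below masks classified fields in a log record before it is written to the lake. The field names and salt are hypothetical stand-ins; a real pipeline would take both from a data-classification policy and a managed secret.

```python
# Minimal obfuscation sketch: replace sensitive fields in a log record
# with salted hashes before it lands in the data lake. The field names
# and salt are hypothetical assumptions, not a prescribed scheme.
import hashlib

SENSITIVE_FIELDS = {"email", "ip", "user_id"}  # assumed classification
SALT = b"rotate-me"                            # assumed per-environment secret

def mask(record):
    """Return a copy of the record with sensitive values hashed."""
    masked = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()
            masked[key] = digest[:16]  # truncation still allows equality joins
        else:
            masked[key] = value
    return masked

print(mask({"email": "jane@example.com", "ip": "10.0.0.7", "action": "login"}))
```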

Gevers estimates there are just over 5,300 Internet-facing Hadoop implementations. It isn’t known how many aren’t secure.

The GDI Foundation offers this advice for securing Hadoop, largely pulled from Apache (a configuration-check sketch follows the list):

  1. Ensure Security is on.
  2. Turn Hadoop Safemode on.
  3. Turn on service-level authorization.
  4. Apply network filtering or firewall rules to block port 50070 from untrusted IPs.
  5. Add identity and access management (IAM) controls and network segmentation, for example with an OpenVPN solution.
  6. Implement a reverse proxy, such as Apache Knox, to help prevent unauthorized access and manage connectivity to Hadoop.
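
A quick way to verify some of these settings is to read them straight out of Hadoop’s configuration files. The sketch below assumes the common /etc/hadoop/conf location, which varies by distribution, and checks the standard properties behind items 1 and 3 (Kerberos authentication and service-level authorization) plus HDFS permission checking; the network controls in items 4 through 6 have to be verified at the firewall and proxy instead.

```python
# Minimal audit sketch: read Hadoop's XML configuration and flag the
# insecure defaults this checklist targets. The /etc/hadoop/conf paths
# are common locations but vary by distribution; the property names are
# standard Hadoop settings.
import xml.etree.ElementTree as ET

CHECKS = {
    "hadoop.security.authentication": "kerberos",  # item 1: security on
    "hadoop.security.authorization": "true",       # item 3: service-level authorization
    "dfs.permissions.enabled": "true",             # HDFS permission checking
}

def load_props(path):
    """Read <property><name>/<value> pairs from one Hadoop config file."""
    props = {}
    for prop in ET.parse(path).getroot().iter("property"):
        name = prop.findtext("name")
        if name:
            props[name] = (prop.findtext("value") or "").strip()
    return props

found = {}
for conf in ("/etc/hadoop/conf/core-site.xml", "/etc/hadoop/conf/hdfs-site.xml"):
    try:
        found.update(load_props(conf))
    except OSError:
        print(f"skipped {conf} (not found)")

for name, wanted in CHECKS.items():
    actual = found.get(name, "<unset>")
    flag = "OK   " if actual == wanted else "CHECK"
    print(f"{flag} {name} = {actual} (want {wanted})")
```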
