Why a storage-agnostic approach can offer enterprises greater flexibility across data protocols

NAS or SAN? Flash or spinning disk? Data storage choices used to be simpler for enterprises, but in the era of cloud, edge networks and big data, they are grappling with more options for storing their data and with ensuring it moves to the media best suited to their applications' needs.

As Primary Data found in a study it released last month, organizations are managing a growing number of storage systems from multiple vendors, a situation the company set out to simplify. This week at EMC World, it announced that its storage-agnostic DataSphere platform can unite EMC systems including ScaleIO software-defined scale-out block storage and Isilon scale-out NAS, as well as Amazon S3 cloud storage.

Primary Data CEO Lance Smith said this data virtualization allows enterprises to integrate the capabilities of their diverse storage resources across file, block, and object access protocols so they can scale their EMC Data Lake across different storage platforms from that vendor.

Smith said DataSphere can move live data, based on service level objectives, across storage systems that use different protocols: ScaleIO block storage, Isilon file storage, and object-based cloud storage such as Amazon S3.

The DataSphere architecture separates the data path from the control path so that applications are pointed to the storage asset that makes the most sense based on whether the data is hot or cold. IT is able to define application performance, protection and price requirements. As data becomes hot or cold, said Smith, DataSphere makes sure it is placed automatically on the storage platform that best meets evolving objectives without application disruption.
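How such objective-driven placement could work is easier to see in a small sketch. The Python below is illustrative only, assuming a made-up tier catalog and SLO model rather than DataSphere's actual interfaces: it simply picks the cheapest tier that still meets every IT-defined objective.

```python
from dataclasses import dataclass

@dataclass
class ServiceLevelObjective:
    """IT-defined requirements for a dataset: performance, protection, price."""
    min_iops: int           # performance floor
    min_replicas: int       # protection requirement
    max_cost_per_gb: float  # price ceiling

@dataclass
class Tier:
    name: str
    iops: int
    replicas: int
    cost_per_gb: float

# Illustrative catalog loosely mirroring the tiers named in the article.
CATALOG = [
    Tier("local-flash", iops=100_000, replicas=2, cost_per_gb=0.50),
    Tier("scale-out-nas", iops=10_000, replicas=3, cost_per_gb=0.10),
    Tier("cloud-object", iops=500, replicas=3, cost_per_gb=0.02),
]

def place(slo: ServiceLevelObjective) -> str:
    """Return the cheapest tier that still satisfies every objective."""
    candidates = [t for t in CATALOG
                  if t.iops >= slo.min_iops
                  and t.replicas >= slo.min_replicas
                  and t.cost_per_gb <= slo.max_cost_per_gb]
    if not candidates:
        raise ValueError("no tier satisfies this SLO")
    return min(candidates, key=lambda t: t.cost_per_gb).name

# A hot database lands on flash; a rarely read archive lands on object storage.
print(place(ServiceLevelObjective(50_000, 2, 1.00)))  # -> local-flash
print(place(ServiceLevelObjective(100, 3, 0.05)))     # -> cloud-object
```

Because the control path owns this decision while the data path serves I/O, the same logic can rerun as a dataset heats up or cools down, which is what allows placement to change without the application being redirected by hand.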

Hot data would go on local flash storage, for example, while a file that hasn’t been used in some time, now cold enough to be deemed frozen, said Smith, could be moved to cheap cloud storage. “We send it out to any object store that anyone wants,” he said. “If someone pushes their entire infrastructure into the cloud, we are still applicable in that scenario.”
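The frozen-file scenario can be sketched the same way. Assuming last-access time as the heat signal, and boto3 with a hypothetical bucket name standing in for "any object store," a crude scanner might look like this; a platform like DataSphere would do the equivalent transparently, without the application noticing.

```python
import os
import time
import boto3

COLD_AFTER_SECONDS = 180 * 24 * 3600  # untouched for ~6 months counts as frozen
BUCKET = "example-cold-archive"       # hypothetical bucket name
s3 = boto3.client("s3")

def offload_frozen(root: str) -> None:
    """Walk a directory tree and move frozen files to cheap object storage."""
    now = time.time()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            # st_atime is the last-access timestamp: our (crude) heat metadata.
            if now - os.stat(path).st_atime > COLD_AFTER_SECONDS:
                key = os.path.relpath(path, root)
                s3.upload_file(path, BUCKET, key)  # copy out to the object tier
                os.remove(path)  # reclaim the expensive local capacity

offload_frozen("/mnt/primary-storage")
```

A single timestamp is the crudest possible heat signal, which is why the metadata question Smith raises next matters.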

Smith said the challenge is getting enough metadata to make some automated decisions so that IT can manage the organization’s data without throwing a lot of resources at it.

Henry Baltazar, research director for the Storage Channel at 451 Research, said Primary Data’s value proposition is not the ability to differentiate data in and of itself, but that once the platform has that insight it can move data around, a capability that used to be available only within a single storage array.

It’s a problem that’s been around for more than a decade, he said. “A decade ago, we had disk and faster disk. The performance differential wasn’t a big deal.” But with the emergence of flash as a faster tier and the availability of cheap, slower cloud storage, there’s more impetus to move data around. Even so, moving data remains expensive.

Baltazar said Primary Data’s ability to be storage system agnostic is significant, given that many organizations end up with multiple vendors in their environments, in part due to merger and acquisition activity. Just because one company buys another, he said, it doesn’t mean they will rebuild their storage infrastructure.

And while some organizations do buy all of their storage from one vendor, Baltazar said most companies will have multiple types due to innovation over time, and generally multiple vendors can be found in most enterprise data centres. “Object storage is very new. You may not buy EMC even if you already have EMC.”

That independence could also give enterprises more leverage in purchase decisions, because they would be able to play one vendor off another, Baltazar added. “Primary Data could be the referee.”

Gary Hilson
Gary Hilson is a Toronto-based freelance writer who has written thousands of words for print and pixel in publications across North America. His areas of interest and expertise include software, enterprise and networking technology, memory systems, green energy, sustainable transportation, and research and education. His articles have been published by EE Times, SolarEnergy.Net, Network Computing, InformationWeek, Computing Canada, Computer Dealer News, Toronto Business Times and the Ottawa Citizen, among others.
