Survey: There’s too much data, yet there’s also too much storage capacity

While organizations – and end users – are drowning in data, enterprises are also swamped in storage, both in terms of over-capacity and types of database platforms.

At least that’s the conclusion of Primary Data, which conducted a small survey of enterprises worldwide. The results confirmed the data deluge, said CEO Lance Smith, but also showed that most organizations have twice the storage they actually need, and some are overprovisioned by as much as five times the data they actually hold. “Half of their storage is running idle.”

Smith said the standard approach is to match specific applications to specific types of storage, provisioning capacity in advance to cover their SLAs. This creates data silos, and the effect is magnified when a range of storage platforms is available, including flash arrays, traditional spinning disk, SANs and cloud-based storage.

It’s not just multiple types of storage; most enterprises also manage more than one vendor, according to the Primary Data study. About one-third of organizations are managing more than 20 different types of storage devices, Smith said, generally because they have been around a while, buying different systems over the years and accumulating others through corporate acquisitions. Just over half of enterprises manage 10 or more storage technologies.

“No one said just one,” Smith said.

The Primary Data study found that nearly two-thirds of enterprises manage storage from three to nine different vendors. Ultimately, this makes storage management complex, said Smith, a lesson he learned as one of the founders of Fusion-io, which was later acquired by SanDisk.

“We learned an interesting lesson,” he said. “We had to manually configure everything to work. It became a complex storage management problem.”

Primary Data’s approach to solving this complexity and system sprawl is to manage not the data itself but its metadata: it virtualizes data and places it on the right type of storage based on its attributes and use case.

“A file can be promoted or demoted to whatever storage makes sense,” he said. The approach is a natural extension of the early days of flash, when it was first added to storage arrays and administrators had to determine which data belonged on the faster media.
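The promote-or-demote decision Smith describes can be sketched as a simple policy over file metadata. This is purely an illustration: the tier names, thresholds, and `FileMeta` fields below are hypothetical assumptions, not Primary Data’s actual product logic.

```python
from dataclasses import dataclass

@dataclass
class FileMeta:
    path: str
    size_bytes: int
    reads_per_day: int      # observed access frequency
    latency_sla_ms: float   # latency the owning application requires

def choose_tier(meta: FileMeta) -> str:
    """Promote or demote a file to whatever storage makes sense,
    based only on its metadata (hypothetical thresholds)."""
    if meta.latency_sla_ms < 1.0 or meta.reads_per_day > 1000:
        return "flash-array"        # hot: needs fast media
    if meta.reads_per_day > 10:
        return "spinning-disk-san"  # warm: ordinary disk is fine
    return "cloud-object-store"     # cold: cheapest capacity

print(choose_tier(FileMeta("/db/index", 2**30, 5000, 0.5)))   # flash-array
print(choose_tier(FileMeta("/logs/2015", 2**33, 0, 100.0)))   # cloud-object-store
```

Because the decision reads only metadata, the same policy can span every vendor and platform in the environment, which is the point of virtualizing the data layer.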

The approach allows enterprises to load balance across capacity and is vendor agnostic. In the process, Smith said, they gain a better understanding of which storage technologies they are managing and can begin a round of spring cleaning, although many organizations today keep everything, he said. “Nobody is deleting data. Facebook is treating everything as if it were enterprise-class data.”

Another lesson Smith learned at Fusion-io was that about 80 per cent of corporate data was cold, while the other 20 per cent was very hot and actively used. “Customers don’t know what’s hot and cold.” As part of the spring cleaning process, enterprises can get a better understanding of what data isn’t hot and push it to cheaper storage, such as the cloud, and slowly decommission old storage arrays by moving data over time without disrupting applications.
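A minimal sketch of that hot/cold split, assuming last-access time is the signal (the 90-day cutoff and data layout are illustrative assumptions, not anything from the survey):

```python
import time

NINETY_DAYS = 90 * 24 * 3600  # hypothetical cutoff for "cold"

def split_hot_cold(files, now=None, cutoff=NINETY_DAYS):
    """files: list of (path, last_access_epoch) tuples.
    Returns (hot, cold) lists of paths; cold candidates can then be
    pushed to cheaper storage over time."""
    now = time.time() if now is None else now
    hot, cold = [], []
    for path, last_access in files:
        (hot if now - last_access < cutoff else cold).append(path)
    return hot, cold

files = [("/db/orders", 95 * 24 * 3600),   # touched 5 days ago
         ("/archive/2010", 0)]             # never touched
hot, cold = split_hot_cold(files, now=100 * 24 * 3600)
print(hot, cold)  # ['/db/orders'] ['/archive/2010']
```

In practice the migration itself would run gradually in the background, which is what lets old arrays be drained and decommissioned without disrupting applications.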

Another reason organizations end up with so many different types of storage in their infrastructure is that decommissioning old storage is difficult, said Smith. “It can take a year of planning.” Even deploying new equipment takes time, with configuration alone taking a few weeks. Overprovisioning occurs because testing and rolling out a new product is slow, so IT wants to make sure the storage selected for any particular application has enough capacity, then leave it alone, he said. But that means no one else has access to that overprovisioned capacity. “Everyone has gotten used to that.”


Gary Hilson
Gary Hilson is a Toronto-based freelance writer who has written thousands of words for print and pixel in publications across North America. His areas of interest and expertise include software, enterprise and networking technology, memory systems, green energy, sustainable transportation, and research and education. His articles have been published by EE Times, SolarEnergy.Net, Network Computing, InformationWeek, Computing Canada, Computer Dealer News, Toronto Business Times and the Ottawa Citizen, among others.
