“Stale data” causing fresh storage headaches for organizations

Think about your enterprise storage environment and the sheer volume of unstructured data created and managed by staff on a daily basis. Now consider the notion that a large chunk of your corporate information is going stale and unused, according to a recent report.

Mountain View, Calif.-based IT storage vendor Veritas Technologies this month noted that 41 per cent of enterprise data — think rich content such as text files, videos, presentations and spreadsheets — hasn’t been touched in three years.

And it’s more than just a “nice-to-know” — large volumes of unused data make it that much harder to respond if there is ever a security breach, particularly when a fast IT response to a malicious data incident is critical.

The company has unveiled the Data Genomics Index, touted as a way for businesses to rethink data management. The crowd-sourced report analyzed “tens of billions of files and their attributes” from customers’ unstructured data environments. The report aims to function as a “comparison standard” by offering accurate insights into today’s enterprise data environment.

More than 8,000 of the most popular file type extensions were considered in the inaugural Data Genomics Index. The report findings reveal:

  • The real speed of data growth at the file level over the past seven years is roughly 39 per cent year-over-year
  • 91 per cent more text files are created in the fall than in any other season
  • Organizations can reduce their storage costs by 50 per cent by making a deletion or archive strategy a priority
  • Images, videos and developer files are taking up the most storage space in corporate environments

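The compound effect of that growth rate is easy to underestimate. A minimal sketch of what roughly 39 per cent year-over-year growth does to a storage footprint — the 1,000 TB starting volume below is an illustrative assumption, only the growth rate comes from the report:

```python
def project_growth(current_tb: float, annual_rate: float, years: int) -> float:
    """Project storage volume assuming steady compound annual growth."""
    return current_tb * (1 + annual_rate) ** years

# Illustrative: 1,000 TB today, growing at the report's ~39% year-over-year rate
for year in (1, 2, 3):
    print(f"Year {year}: {project_growth(1_000.0, 0.39, year):,.0f} TB")
# Year 1: 1,390 TB
# Year 2: 1,932 TB
# Year 3: 2,686 TB
```

At that rate, a storage environment roughly doubles in just over two years — which is why the budget pressure described below bites so quickly.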
Stale enterprise data is a common issue for storage admins as they struggle to maintain an effective archival strategy. In particular, orphaned data — content without an attributed owner, either through role changes or employee departures — can be costly from a data storage perspective as it tends to be out of sight and out of mind.

“Customers are struggling with two competing forces of nature: the exponential data growth curve, and the restriction of resources and budget to fight it with new servers and applications,” said Veritas CTO Steve Vranyes in a statement.

The average corporate storage environment has approximately 10 petabytes of data — and conducting an archive project to cull this information could return as much as $2 million a year in storage savings, the company claimed. 
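The arithmetic behind that claim can be sketched. The per-terabyte cost and the assumption that all stale data is reclaimed are illustrative, not figures from Veritas; only the 10 PB environment size and 41 per cent stale fraction come from the article:

```python
def annual_savings(total_pb: float, stale_fraction: float,
                   cost_per_tb_year: float, reclaim_fraction: float) -> float:
    """Estimate yearly storage savings from archiving or deleting stale data."""
    stale_tb = total_pb * 1_000 * stale_fraction  # convert PB to TB
    return stale_tb * reclaim_fraction * cost_per_tb_year

# 10 PB environment, 41% stale, an assumed $500/TB/year fully loaded cost,
# and (optimistically) reclaiming the entire stale tier
print(f"${annual_savings(10, 0.41, 500, 1.0):,.0f}")
# $2,050,000
```

With an assumed fully loaded cost in the neighbourhood of $500 per terabyte per year, the company’s $2-million figure is at least in a plausible range; the real saving depends heavily on what archived data costs to keep versus delete.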


Ryan Patrick
Seasoned technology reporter, editor and senior content producer.
