ILM step by step

Hardware and software vendors like to define information lifecycle management – ILM – in terms of their own products. The fact is, however, that ILM is simply a philosophy of ensuring that the infrastructure chosen for storing data is periodically realigned with the actual business value of that data throughout its life.

This approach to ILM can be based on a reference architecture that provides tiers of service with differing capabilities and correspondingly different costs. But how will a move from tier to tier be triggered? Everyone talks about ILM, but it seems most implementations are in fact DLM strategies – where D is for data. What is the difference between data and information? How is the value of data calculated? When is the delta in value sufficient to trigger a migration? Are different strategies needed for structured and unstructured data? Is ILM in fact application specific? What is the impact of ILM-driven data migration on recovery capabilities? Will the cost of administering and controlling the ILM environment chew up the projected savings?

Let’s briefly look at the questions we need to ask of ourselves and the vendor before embarking on a journey towards ILM.


Just what is the difference between data and information? How does it affect lifecycle management? Data generally needs rendering by an application before the result can be used. Information is data that has been rendered into a human-actionable form. What are the implications?

Valuing Data – $$$ or dreams

This is an area where many vendors take the broadest definition of value and use subjective measures such as “importance.” If movement of data is to be automated and based on change in value, then value needs to have some empirical base. This empirical base also needs to be capable of dynamically reflecting current “value” to allow continuous monitoring for change.
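To make the point concrete, here is a minimal sketch of what an empirical value score might look like, built only from measurable attributes (access activity, revenue attribution, compliance holds) rather than subjective "importance" ratings. The attribute names, weights and thresholds are illustrative assumptions, not a standard model.

```python
# Hypothetical sketch: an empirical, continuously recomputable "value" score
# for a dataset. All attributes are measurable; all weights are assumptions.
from dataclasses import dataclass

@dataclass
class DatasetMetrics:
    accesses_per_day: float    # measured from access logs
    revenue_linked_usd: float  # annual revenue attributed to apps using this data
    compliance_hold: bool      # subject to a retention or legal hold

def value_score(m: DatasetMetrics) -> float:
    """Combine measurable attributes into a single comparable score."""
    score = 0.5 * m.accesses_per_day + 0.001 * m.revenue_linked_usd
    if m.compliance_hold:
        score += 100.0         # a hold pins data regardless of activity
    return score

hot = DatasetMetrics(accesses_per_day=200, revenue_linked_usd=50_000,
                     compliance_hold=False)
cold = DatasetMetrics(accesses_per_day=0.1, revenue_linked_usd=0,
                      compliance_hold=False)
```

Because every input here can be re-measured on a schedule, the score can be monitored continuously for change – which is precisely the property a subjective "importance" rating lacks.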

Enough change is enough?

Let’s assume that we have been able to calculate, and subsequently dynamically maintain, the value of data over time. Now we need to ask ourselves what delta in value would be significant enough to trigger migration to a more appropriate tier of storage. This is perhaps one of the few easy ILM questions to answer, as we have already learned how to do this when we set up the service provider model.

ILM – camouflage for archiving and HSM

Even if it proves impossible to value data dynamically enough to trigger automated migration, some ILM migrations can still be auto-triggered by non-value-based criteria such as date, age and other standardized metadata components. However, it is difficult to view this activity as anything other than archiving in an HSM environment.
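An age-driven trigger of this HSM flavour is simple enough to sketch. The tier names and age thresholds below are assumptions for illustration; the point is that nothing here consults business value at all – only file metadata.

```python
# Illustrative sketch of a non-value-based (HSM-style) trigger: tier selection
# driven purely by file age. Tier names and thresholds are assumed examples.
import os
import time

AGE_TIERS = [                  # (max age in days, tier name)
    (30,   "tier1-primary"),
    (180,  "tier2-nearline"),
    (1095, "tier3-archive"),
]

def tier_for_age(age_days: float) -> str:
    """Return the first tier whose age ceiling covers this data."""
    for max_age, tier in AGE_TIERS:
        if age_days <= max_age:
            return tier
    return "tier4-tape"        # anything older than three years

def tier_for_file(path: str) -> str:
    """Classify a real file by its last-modified timestamp."""
    age_days = (time.time() - os.path.getmtime(path)) / 86400
    return tier_for_age(age_days)
```

A scheduled job walking a file system with `tier_for_file` is, in effect, a classic HSM sweep – which is the article's point about what such "ILM" really is.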

Once the issues above are well understood, their implications are apparent and the cost benefit of ILM has been agreed on, a CIO can develop a transition plan for data that fits the ILM parameters as follows:

A five-step transition to ILM

1. Determine a cost model that can show savings that will be delivered by moving data from point A to point B on condition X. This allows your policies to be developed based on a rational cost saving basis.

2. Determine your strategic archiving policy. What types of data will move, what will trigger the move (attribute delta), and where will it move to based on the attribute?

3. Determine tactical archiving policies. Define, for each specific data type or even database or file folder, the exact attribute and its delta metric that will trigger a move to a defined tier or location.

4. Attempt to model the frequency of change and the migration time of data along with its impact on production. This will provide a basis for the degree of automation possible and possibly even a justification for investment in automation.

5. Develop standard operating procedures with associated compliance, completion and quality metrics to discipline the operation and place it on a highly repeatable basis, whether it be automated or manually triggered.
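Step 1's cost model can be sketched in a few lines: the projected first-year saving from moving a given amount of data from point A to point B, net of the one-time migration cost. The per-tier prices and migration cost below are illustrative assumptions, not real vendor figures; a negative result means the move does not pay for itself in year one.

```python
# Minimal cost-model sketch for step 1. All dollar figures are assumed
# examples; substitute your own fully loaded per-tier costs.

TIER_COST_GB_YEAR = {          # assumed $/GB/year, fully loaded
    "tier1-primary": 3.00,
    "tier2-nearline": 1.20,
    "tier3-archive": 0.30,
}
MIGRATION_COST_PER_GB = 0.05   # assumed one-time admin/operations cost

def first_year_saving(gb: float, src: str, dst: str) -> float:
    """Net saving in year one of moving `gb` gigabytes from src to dst."""
    storage_delta = (TIER_COST_GB_YEAR[src] - TIER_COST_GB_YEAR[dst]) * gb
    return storage_delta - MIGRATION_COST_PER_GB * gb
```

Running this over candidate data sets gives the "rational cost saving basis" step 1 asks for, and feeds directly into the policy triggers of steps 2 and 3.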

But above all else – know what you want, why you want it, what it will cost and what it will save you.

You decide.

Dick Benton is principal consultant with Glasshouse Technologies Inc.
